Abstract:
Machine learning and deep learning have reshaped computer science and information technology, continually delivering impressive results and solving problems across many fields. With the rapid growth of deep learning, however, data privacy has become a serious concern. Federated Learning addresses this concern by training models on sensitive data without sharing the raw data itself: multiple clients, each directly connected to a central server, train a shared model by exchanging only model parameters with that server, so the training data is never exchanged. In this project, we first reproduce results reported in several research papers and then study robust Federated Learning. We implement Federated Learning in a wireless setting in the presence of Additive White Gaussian Noise (AWGN) and investigate how FedAvg behaves when the channel noise is non-Gaussian. We propose a new power control algorithm that exposes the effect of channel noise on Federated Learning. We observe that Federated Learning in a wireless setting fails to converge in the presence of impulsive noise; applying the FedProx algorithm instead, we find that FedProx provides more robust convergence than FedAvg.
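The setting summarized above can be illustrated with a minimal toy sketch: FedAvg on a linear-regression task where each client's updated weights traverse an AWGN uplink before the server averages them, with an optional FedProx proximal term in the local objective. All names, the SNR-based noise model, and the hyperparameters here are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, w_global=None, mu=0.0, lr=0.1, steps=5):
    # Gradient steps on a linear least-squares loss. With mu > 0 this
    # becomes a FedProx-style update: a proximal term (mu/2)*||w - w_global||^2
    # is added, which damps how far each client drifts from the server model.
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        if mu > 0.0:
            grad = grad + mu * (w - w_global)
        w = w - lr * grad
    return w

def fed_round(w_global, clients, snr_db=None, mu=0.0):
    # One communication round: clients train locally, transmit their
    # weights over the uplink, and the server averages them (FedAvg).
    updates = []
    for X, y in clients:
        w_local = local_update(w_global.copy(), X, y, w_global, mu)
        if snr_db is not None:
            # AWGN uplink: noise power chosen to hit the target SNR (in dB).
            sig_pow = np.mean(w_local ** 2)
            noise_pow = sig_pow / (10 ** (snr_db / 10))
            w_local = w_local + rng.normal(0.0, np.sqrt(noise_pow), w_local.shape)
        updates.append(w_local)
    return np.mean(updates, axis=0)

# Toy setup: two clients whose data share the true weights [2, -1].
w_true = np.array([2.0, -1.0])
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(30):
    w = fed_round(w, clients, snr_db=20)  # FedAvg over a 20 dB AWGN channel
```

Under this toy model the iterates hover near the true weights at 20 dB SNR; replacing the Gaussian draw with a heavy-tailed (impulsive) noise model is what degrades FedAvg in the experiments the abstract describes, and passing `mu > 0` switches the local solver to the FedProx objective.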