Abstract:
Distributed learning is a form of machine learning in which data is distributed across a network of multiple devices. Each node trains a model independently, and the results are then combined to produce a global model. This greatly accelerates and parallelizes the training process and reduces optimization time. Federated learning, by contrast, is a secure distributed machine learning paradigm designed specifically for privacy-sensitive scenarios: the data stays on users' local devices, and models are trained on those devices without sharing the data with a central server. To enhance the capabilities of federated learning, Mixture Density Networks (MDNs) are integrated into the federated learning environment. MDNs are neural networks that can model complex probability distributions and quantify prediction uncertainty. In this study, an MDN is implemented in three distinct federated learning settings and their performance is compared. The primary objectives of this research are to investigate the use of MDNs in distribution learning and to evaluate the effectiveness of federated learning. The experiments focus specifically on the federated learning approach to distribution learning, with direct emphasis on the distribution itself rather than on other properties that may affect the data.
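For reference, a Mixture Density Network in its standard formulation models the conditional density of a target y given an input x as a mixture of Gaussians whose parameters (mixing coefficients, means, and variances) are produced by the network; this is the generic textbook form, not necessarily the exact configuration used in this study:

\[
p(y \mid x) = \sum_{k=1}^{K} \pi_k(x)\, \mathcal{N}\!\left(y \mid \mu_k(x), \sigma_k^2(x)\right), \qquad \sum_{k=1}^{K} \pi_k(x) = 1,\ \pi_k(x) \ge 0.
\]

Training such a network typically minimizes the negative log-likelihood of this mixture, which is what allows the model to capture multimodal distributions and report prediction uncertainty rather than a single point estimate.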