Enhancing Federated Learning Through Low-Rank Adaptation and Singular Value Decomposition
Date of Award
4-30-2024
Document Type
Thesis
Degree Name
Master of Science in Machine Learning
Department
Machine Learning
First Advisor
Dr. Samuel Horvath
Second Advisor
Dr. Martin Takac
Abstract
"This thesis investigates an innovative direction in the field of federated learning (FL) by enhancing model communication efficiency and handling data heterogeneity through the integration of Low-Rank Adaptation (LoRA) and Singular Value Decomposition (SVD) techniques. As distributed data sources surge, especially in today's era where privacy preservation is increasingly emphasized, FL offers an effective framework for training models without centralizing data. However, one of the main challenges FL faces is efficiently handling non-independent and identically distributed (non-IID) data while minimizing the communication overhead during model updates. To address these challenges, this study proposes a federated learning method that combines truncated SVD with LoRA. This method aims to optimize the model parameter update and transmission process through low-rank matrix techniques, thereby reducing communication costs and enhancing model adaptability. Experimental validation on the MNIST and CIFAR-10 datasets demonstrates the effectiveness of the proposed method in dealing with data heterogeneity and compares its performance with traditional FL algorithms, such as FedAvg and FedProx. The results indicate that, compared to standard FL algorithms, our method shows improved performance under various settings, especially in environments with highly heterogeneous data distributions. Moreover, by adjusting the degree of low-rank adaptation, our method can find an optimal balance between model complexity and accuracy across different data distribution scenarios, thus mitigating the risk of overfitting and enhancing the model's generalizability. The contribution of this research lies in proposing a FL framework that combines SVD and LoRA, offering a new perspective and approach for model optimization in federated learning. This method not only provides solutions to the current challenges of communication efficiency and data heterogeneity in FL but also opens up a new pathway for optimizing distributed machine learning models using low-rank matrix techniques."
Recommended Citation
Y. Chen, "Enhancing Federated Learning Through Low-Rank Adaptation and Singular Value Decomposition,", Apr 2024.
Comments
Thesis submitted to the Deanship of Graduate and Postdoctoral Studies
In partial fulfilment of the requirements for the M.Sc. degree in Machine Learning
Advisors: Dr. Samuel Horvath, Dr. Martin Takac
Online access available for MBZUAI patrons