NEO Seminar: Stefano Rini – Communication-Efficient Federated Learning: Challenges, Techniques, and Insights

Speaker: Stefano Rini, National Yang Ming Chiao Tung University (NYCU), Taiwan

Title: Communication-Efficient Federated Learning: Challenges, Techniques, and Insights

Time and Place: April 25, 2024 at 14h30 in Salle Lagrange Gris, Inria, Sophia-Antipolis

Abstract: Federated Learning (FL) has emerged as a powerful approach to training large models on distributed datasets, offering advantages in terms of data locality, scalability, and privacy. However, the practical implementation of FL faces significant challenges, primarily due to communication constraints between the remote learners and the Parameter Server (PS). In this talk, we will provide a comprehensive survey of the current state of communication-efficient federated learning, exploring the various techniques and methodologies that have been proposed to address these challenges.

We will first discuss the practical issues arising from distributed training, focusing on the communication bottleneck between remote learners and the PS. Next, we will delve into the two main classes of gradient compression algorithms: gradient sparsification and gradient quantization. Furthermore, we will explore the role of dimensionality reduction algorithms and error feedback mechanisms in improving training performance.
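To make these classes of techniques concrete, here is a minimal sketch (not from the talk) of the two compression families mentioned above, plus a generic error feedback step. The function names (`topk_sparsify`, `sign_quantize`, `error_feedback_step`) are illustrative, not part of any specific algorithm discussed by the speaker.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Gradient sparsification: keep only the k largest-magnitude
    entries of the gradient and zero out the rest."""
    out = np.zeros_like(grad)
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    out[idx] = grad[idx]
    return out

def sign_quantize(grad):
    """Gradient quantization (1-bit style): transmit the sign of each
    entry plus a single scale, here the mean absolute value."""
    scale = np.mean(np.abs(grad))
    return scale * np.sign(grad)

def error_feedback_step(grad, memory, compress):
    """Generic error feedback: add the residual left over from the
    previous round's compression before compressing again, so that
    the compression error is corrected over time rather than lost."""
    corrected = grad + memory
    compressed = compress(corrected)
    new_memory = corrected - compressed  # residual carried to next round
    return compressed, new_memory

g = np.array([0.9, -0.1, 0.05, -1.2, 0.3])
print(topk_sparsify(g, 2))   # only the two largest-magnitude entries survive
print(sign_quantize(g))      # every entry becomes +/- the same scale
```

In a full FL round, each learner would apply one of these compressors to its local gradient before uploading it to the PS, with the error-feedback memory kept locally between rounds.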

Our contribution, the M22 algorithm, will be presented in the context of these broader developments. M22, a rate-distortion-inspired approach to gradient compression, leverages an M-magnitude-weighted L2 distortion measure and a two-degrees-of-freedom distribution fit to achieve efficient compression in FL scenarios.
