Mathematics of Data Science Seminar - Paul Laiu

April 17, 2025
11:30 am - 12:20 pm
Math Tower (MW) 154


Paul Laiu
Oak Ridge National Laboratory

Title
Accelerated and communication efficient federated learning algorithms

Abstract
Federated learning (FL) is a distributed machine learning paradigm that enables multiple local devices, i.e., clients, and a central server to collaboratively train a machine learning model from data distributed across the clients without moving the data. Common challenges in FL include data privacy preservation, communication efficiency, and convergence guarantees. In this talk, we present two new FL algorithms, FedOSAA and FeDLRT. FedOSAA leverages a widely used acceleration scheme, Anderson acceleration, to speed up the convergence of standard FL schemes with minimal additional communication cost. FeDLRT adopts a dynamical low-rank training scheme that constrains model training to low-rank factors of the model parameters, reducing both communication and memory costs during training. By incorporating a variance reduction scheme, we establish convergence guarantees for both FL algorithms, even when the data distribution is heterogeneous (non-IID) among the local clients. The advantages of these two FL algorithms are demonstrated in numerical tests on machine learning problems ranging from logistic regression to training transformer models in an FL setting.
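
For readers unfamiliar with the acceleration scheme mentioned in the abstract, the sketch below illustrates classical window-limited Anderson acceleration applied to a generic fixed-point iteration x <- g(x). This is a minimal illustration of the underlying technique only, not the FedOSAA algorithm presented in the talk; the function names and the window size m are illustrative choices.

    import numpy as np

    def anderson_step(g_hist, f_hist):
        # One Anderson-acceleration update from stored fixed-point evaluations
        # g(x_i) and residuals f_i = g(x_i) - x_i. The mixing weights come from
        # a small least-squares problem on residual differences (the standard
        # unconstrained form of the affine-weight condition sum(alpha) = 1).
        F = np.column_stack(f_hist)                  # residuals, one column per stored iterate
        G = np.column_stack(g_hist)                  # fixed-point evaluations
        dF = F[:, 1:] - F[:, :-1]                    # residual differences
        dG = G[:, 1:] - G[:, :-1]
        gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
        return G[:, -1] - dG @ gamma                 # accelerated iterate

    def anderson_fixed_point(g, x0, m=5, tol=1e-8, max_iter=200):
        # Accelerate the fixed-point iteration x <- g(x), keeping a history
        # window of at most m previous iterates.
        x = x0
        g_hist, f_hist = [], []
        for _ in range(max_iter):
            gx = g(x)
            f = gx - x
            if np.linalg.norm(f) < tol:
                return x
            g_hist.append(gx)
            f_hist.append(f)
            g_hist, f_hist = g_hist[-m:], f_hist[-m:]
            # Fall back to the plain update until two residuals are available.
            x = gx if len(f_hist) < 2 else anderson_step(g_hist, f_hist)
        return x

    # Example: solve the scalar fixed-point problem x = cos(x).
    root = anderson_fixed_point(np.cos, np.array([1.0]))

In an FL setting, FedOSAA applies this kind of extrapolation at the server to the aggregated updates, which is why the additional communication cost can be kept small; the details are given in the talk.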
 
