Seminar Details
2025-10-07 (14:00): Communication-efficient distributed optimization algorithms
At Euler building (room A.002)
Organized by Mathematical Engineering
Speaker:
Laurent Condat (King Abdullah University of Science and Technology (KAUST))
Abstract:
In distributed optimization and machine learning, a large number of machines perform computations in parallel and communicate back and forth with a server. In particular, in federated learning, the distributed training process runs on personal devices such as mobile phones. In this setting, communication, which can be slow, costly, and unreliable, is the main bottleneck. Two strategies are popular for reducing it: 1) local training, which consists of communicating less frequently; 2) compression of the exchanged messages. In addition, a robust algorithm should allow for partial participation. I will present several randomized algorithms we developed recently, with proven convergence guarantees and accelerated complexity. Our most recent paper, “LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression”, was presented as a Spotlight at the ICLR conference in April 2025.
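To make the two strategies from the abstract concrete, here is a minimal toy sketch (not the speaker's LoCoDL algorithm, whose details are in the paper): each worker performs several gradient steps locally before communicating (local training), and sends its model update through an unbiased rand-k sparsifier (compression). The quadratic losses, step sizes, and the rand-k compressor are illustrative assumptions.

```python
# Toy illustration of local training + compressed communication.
# NOT the LoCoDL algorithm; a generic sketch under assumed settings.
import numpy as np

rng = np.random.default_rng(0)
d, n_workers, k = 20, 5, 4          # dimension, machines, coords kept by rand-k

# Each worker i holds a quadratic loss f_i(x) = 0.5 * ||A_i x - b_i||^2 (assumed).
A = [rng.standard_normal((30, d)) for _ in range(n_workers)]
b = [rng.standard_normal(30) for _ in range(n_workers)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

def rand_k(v):
    """Unbiased rand-k compressor: keep k random coordinates, rescale by d/k."""
    mask = np.zeros_like(v)
    idx = rng.choice(len(v), size=k, replace=False)
    mask[idx] = len(v) / k
    return v * mask

x = np.zeros(d)                     # server model
lr, local_steps, rounds = 1e-3, 10, 200

for r in range(rounds):
    updates = []
    for i in range(n_workers):          # in practice these run in parallel
        y = x.copy()
        for _ in range(local_steps):    # local training: no communication here
            y -= lr * grad(i, y)
        updates.append(rand_k(y - x))   # compress the update before sending
    x += np.mean(updates, axis=0)       # server aggregates compressed updates

full_grad = sum(grad(i, x) for i in range(n_workers))
print(f"after {rounds} rounds: ||sum of gradients|| = {np.linalg.norm(full_grad):.4f}")
```

In this sketch, each round costs one compressed upload per worker instead of one full gradient per local step, which is the communication saving both strategies target; the algorithms presented in the talk combine these ideas with convergence guarantees.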
