Extensions of Dynamic Programming, Machine Learning, Discrete Optimization
TREES
EF21-P
EF21-P and friends: Improved theoretical communication complexity for distributed optimization with bidirectional compression
Feb 6, 12:00 - 13:00
B9 L2 H2
Tags: EF21-P, distributed optimization, gradient descent
In this work we focus on distributed optimization problems in settings where the communication time between the server and the workers is non-negligible. We obtain novel methods supporting bidirectional compression (both from the server to the workers and vice versa) that enjoy new state-of-the-art theoretical communication complexity for convex and nonconvex problems.
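To make "bidirectional compression" concrete, the following is a minimal sketch of the communication pattern the abstract describes: workers compress their gradients on the uplink, and the server compresses the update it broadcasts on the downlink. This is not the EF21-P method itself; the Top-K compressor, the toy quadratic losses, and the step size are all illustrative assumptions.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v; zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

rng = np.random.default_rng(0)
d, n_workers, k, lr = 10, 4, 3, 0.1

# Toy setup (assumed for illustration): worker i holds the quadratic
# loss f_i(x) = 0.5 * ||x - a_i||^2, whose gradient is x - a_i.
targets = [rng.standard_normal(d) for _ in range(n_workers)]
x = np.zeros(d)

for _ in range(200):
    # Uplink: each worker compresses its local gradient before sending.
    worker_msgs = [top_k(x - a, k) for a in targets]
    avg_grad = np.mean(worker_msgs, axis=0)
    # Downlink: the server compresses the model update it broadcasts back.
    x = x - top_k(lr * avg_grad, k)

# The minimizer of the average loss is the mean of the workers' targets.
print("distance to optimum:", np.linalg.norm(x - np.mean(targets, axis=0)))
```

Per round, each worker transmits only k of d coordinates and the server broadcasts only k of d, which is the communication saving bidirectional compression targets; the methods analyzed in the talk achieve this with stronger theoretical guarantees than this naive scheme.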