Title: Robust Machine Learning
Abstract: Distributed machine learning is essential for handling the computational demands of model training. However, heterogeneous hardware capabilities and the presence of unreliable or malicious devices pose significant challenges. Standard approaches, such as SGD with averaging-based aggregation, lose their convergence guarantees in this setting. We investigate new distributed optimization algorithms that are simultaneously robust to hardware heterogeneity and tolerant of adversarial devices, while minimizing the impact on training quality.
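As an illustration (not part of the call itself), a minimal sketch of why averaging-based aggregation is fragile: a single adversarial worker can drag the mean of the submitted gradients arbitrarily far, while a robust aggregator such as the coordinate-wise median stays close to the honest gradients. The worker values below are invented for the example.

```python
import numpy as np

# Three honest workers report similar gradients; one Byzantine worker
# reports an arbitrarily large vector.
honest = [np.array([1.0, 2.0]), np.array([1.1, 1.9]), np.array([0.9, 2.1])]
byzantine = np.array([1e6, -1e6])
grads = honest + [byzantine]

# Averaging: the single malicious gradient dominates the result.
mean_agg = np.mean(grads, axis=0)

# Coordinate-wise median: the malicious gradient is effectively ignored
# as long as a majority of workers are honest.
median_agg = np.median(grads, axis=0)
```

Here `mean_agg` ends up on the order of 10^5 in each coordinate, while `median_agg` remains near the honest consensus (about [1.05, 1.95]).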
Dates
March 1st, 2025 → March 15th, 2025
Abstract submission deadline
March 8th, 2025 → March 15th, 2025
Paper submission deadline
April 14th, 2025
Accept/Reject notification
May 21-23, 2025
Netys Conference