Title: Robust Machine Learning

Abstract: Distributed machine learning is essential for handling the computational demands of model training. However, heterogeneous hardware capabilities and the presence of unreliable or malicious devices pose significant challenges. Standard approaches, such as SGD with averaging-based aggregation, lose their convergence guarantees in this setting. We investigate new distributed optimization algorithms that are simultaneously robust to hardware heterogeneity and tolerant of adversarial devices, while minimizing the impact on training quality.
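To illustrate why averaging-based aggregation is fragile, the sketch below compares plain gradient averaging with the coordinate-wise median, one standard robust aggregation rule from the Byzantine-tolerant learning literature (not necessarily the method investigated here). A single malicious worker can shift the mean arbitrarily, while the median stays close to the honest gradients.

```python
import numpy as np

def aggregate_mean(grads):
    # Averaging: a single malicious gradient can shift the result arbitrarily.
    return np.mean(grads, axis=0)

def aggregate_median(grads):
    # Coordinate-wise median: bounds the influence of a minority
    # of Byzantine workers, at some cost in statistical efficiency.
    return np.median(grads, axis=0)

np.random.seed(0)
# Four honest workers report gradients near the true value [1.0, 1.0].
honest = [np.array([1.0, 1.0]) + 0.01 * np.random.randn(2) for _ in range(4)]
# One Byzantine worker reports an arbitrarily large gradient.
byzantine = [np.array([1e6, -1e6])]
grads = honest + byzantine

print(aggregate_mean(grads))    # dominated by the attacker
print(aggregate_median(grads))  # stays close to [1.0, 1.0]
```

With 4 honest workers out of 5, the median per coordinate is always one of the honest values, so the attacker's influence is bounded regardless of the magnitude of its report.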

Dates

March 11, 2026: Abstract submission deadline
March 18, 2026: Paper submission deadline
April 22, 2026: Accept/Reject notification
June 10-12, 2026: Netys Conference

Proceedings

Partners & Sponsors (TBA)