G. Malinovskiy, D. Kovalev, E. Gasanov, L. Condat, P. Richtárik. From local SGD to local fixed-point methods for federated learning. International Conference on Machine Learning, 6692-6701, 2020. Cited by 139.
D. Kovalev, E. Gasanov, A. Gasnikov, P. Richtárik. Lower bounds and optimal algorithms for smooth and strongly convex decentralized optimization over time-varying networks. Advances in Neural Information Processing Systems 34, 22325-22335, 2021. Cited by 46.
P. Richtárik, I. Sokolov, E. Gasanov, I. Fatkhullin, Z. Li, E. Gorbunov. 3PC: Three point compressors for communication-efficient distributed training and a better theory for lazy aggregation. International Conference on Machine Learning, 18596-18648, 2022. Cited by 32.
E. Gasanov, A. Khaled, S. Horváth, P. Richtárik. FLIX: A simple and communication-efficient alternative to local methods in federated learning. arXiv preprint arXiv:2111.11556, 2021. Cited by 32.
D. Kovalev, P. Richtárik, E. Gorbunov, E. Gasanov. Stochastic spectral and conjugate descent methods. Advances in Neural Information Processing Systems 31, 2018. Cited by 15.
M. Makarenko, E. Gasanov, R. Islamov, A. Sadiev, P. Richtárik. Adaptive compression for communication-efficient distributed training. arXiv preprint arXiv:2211.00188, 2022. Cited by 8.
R. Szlendak, E. Gasanov, P. Richtárik. Understanding progressive training through the framework of randomized coordinate descent. International Conference on Artificial Intelligence and Statistics, 2161-2169, 2024. Cited by 1.
P. Richtárik, E. Gasanov, K. Burlachenko. Error feedback reloaded: From quadratic to arithmetic mean of smoothness constants. arXiv preprint arXiv:2402.10774, 2024. Cited by 1.
E. Gasanov, P. Richtárik. Speeding up stochastic proximal optimization in the high Hessian dissimilarity setting. arXiv preprint arXiv:2412.13619, 2024.
P. Richtárik, E. Gasanov, K. Burlachenko. Error feedback shines when features are rare. arXiv preprint arXiv:2305.15264, 2023.
E. Gasanov, V. Elsukov, P. Richtárik. A new randomized method for solving large linear systems.