Mher Safaryan
Postdoctoral MSCA Fellow, IST Austria
Verified email at ist.ac.at - Homepage
Title · Cited by · Year
On Biased Compression for Distributed Learning
A Beznosikov, S Horváth, P Richtárik, M Safaryan
Journal of Machine Learning Research (JMLR), 2023, 2020
Cited by: 184 · Year: 2020
FedNL: Making Newton-type methods applicable to federated learning
M Safaryan, R Islamov, X Qian, P Richtárik
International Conference on Machine Learning (ICML), 2022, 2021
Cited by: 83 · Year: 2021
Optimal gradient compression for distributed and federated learning
A Albasyoni, M Safaryan, L Condat, P Richtárik
arXiv preprint arXiv:2010.03246, 2020
Cited by: 59 · Year: 2020
Uncertainty principle for communication compression in distributed and federated learning and the search for an optimal compressor
M Safaryan, E Shulgin, P Richtárik
Information and Inference: A Journal of the IMA, 2021, 2020
Cited by: 58 · Year: 2020
Stochastic Sign Descent Methods: New Algorithms and Better Theory
M Safaryan, P Richtárik
International Conference on Machine Learning (ICML), 2021, 2019
Cited by: 53* · Year: 2019
Smoothness matrices beat smoothness constants: Better communication compression techniques for distributed optimization
M Safaryan, F Hanzely, P Richtárik
Advances in Neural Information Processing Systems (NeurIPS) 34, 25688-25702, 2021
Cited by: 30 · Year: 2021
Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning
X Qian, R Islamov, M Safaryan, P Richtárik
International Conference on Artificial Intelligence and Statistics (AISTATS …, 2021
Cited by: 22 · Year: 2021
Construction of free g-dimonoids
Y Movsisyan, S Davidov, M Safaryan
Algebra and discrete mathematics, 2014
Cited by: 18 · Year: 2014
Theoretically Better and Numerically Faster Distributed Optimization with Smoothness-Aware Quantization Techniques
B Wang, M Safaryan, P Richtárik
Advances in Neural Information Processing Systems (NeurIPS) 2022, 2021
Cited by: 14* · Year: 2021
Distributed Newton-type methods with communication compression and Bernoulli aggregation
R Islamov, X Qian, S Hanzely, M Safaryan, P Richtárik
Transactions on Machine Learning Research (TMLR), 2023, 2022
Cited by: 11 · Year: 2022
GradSkip: Communication-accelerated local gradient methods with better computational complexity
A Maranjyan, M Safaryan, P Richtárik
arXiv preprint arXiv:2210.16402, 2022
Cited by: 10 · Year: 2022
On generalizations of Fatou’s theorem for the integrals with general kernels
GA Karagulyan, MH Safaryan
The Journal of Geometric Analysis 25, 1459-1475, 2015
Cited by: 10 · Year: 2015
AsGrad: A Sharp Unified Analysis of Asynchronous-SGD Algorithms
R Islamov, M Safaryan, D Alistarh
International Conference on Artificial Intelligence and Statistics (AISTATS …, 2023
Cited by: 9 · Year: 2023
On Generalizations of Fatou’s Theorem in for Convolution Integrals with General Kernels
MH Safaryan
The Journal of Geometric Analysis 31 (4), 3280-3299, 2021
Cited by: 7 · Year: 2021
On a theorem of Littlewood
GA Karagulyan, MH Safaryan
Hokkaido Mathematical Journal 46 (1), 87-106, 2017
Cited by: 6 · Year: 2017
On an equivalence for differentiation bases of dyadic rectangles
GA Karagulyan, DA Karagulyan, MH Safaryan
Colloq. Math 146 (2), 295-307, 2017
Cited by: 6 · Year: 2017
On an equivalency of rare differentiation bases of rectangles
MH Safaryan
Journal of Contemporary Mathematical Analysis (Armenian Academy of Sciences …, 2018
Cited by: 5 · Year: 2018
MicroAdam: Accurate Adaptive Optimization with Low Space Overhead and Provable Convergence
IV Modoranu, M Safaryan, G Malinovsky, E Kurtic, T Robert, P Richtarik, ...
Advances in Neural Information Processing Systems (NeurIPS) 2024, 2024
Cited by: 4 · Year: 2024
Knowledge Distillation Performs Partial Variance Reduction
M Safaryan, A Peste, D Alistarh
Advances in Neural Information Processing Systems (NeurIPS) 2023, 2023
Cited by: 3 · Year: 2023
LDAdam: Adaptive Optimization from Low-Dimensional Gradient Statistics
T Robert, M Safaryan, IV Modoranu, D Alistarh
arXiv preprint arXiv:2410.16103, 2024
Year: 2024
Articles 1–20