R. M. G., Dirk A. Lorenz, Maximilian Winkler
A Bregman-Kaczmarz method for nonlinear systems of equations, arXiv:2303.08549, 2023.
Preprint
Guillaume Garrigos, Robert M. Gower
Handbook of Convergence Theorems for (Stochastic) Gradient Methods, arXiv:2301.11235, 2023.
Preprint
Rui Yuan, Simon S. Du, R. M. G., Alessandro Lazaric, Lin Xiao
Linear Convergence of Natural Policy Gradient Methods with Log-Linear Policies, ICLR 2023.
Preprint
Proceedings
Shuang Li, William J. Swartworth, Martin Takáč, Deanna Needell, R. M. G.
SP2: A Second Order Stochastic Polyak Method, ICLR 2023.
Preprint
Proceedings
R. M. G., Mathieu Blondel, Nidham Gazagnadou, Fabian Pedregosa.
Cutting Some Slack for SGD with Adaptive Polyak Stepsizes, 2022.
Preprint
R. M. G., Aaron Defazio, Michael Rabbat.
Stochastic Polyak Stepsize with a Moving Target, 2021.
Preprint
Rui Yuan, R. M. G., Alessandro Lazaric.
A general sample complexity analysis of vanilla policy gradient, AISTATS 2022.
Preprint
Proceedings
Jiabin Chen, Rui Yuan, Guillaume Garrigos, R. M. G.
SAN: Stochastic Average Newton Algorithm for Minimizing Finite Sums, AISTATS 2022.
Preprint
Code
Proceedings
Nidham Gazagnadou, Mark Ibrahim, R. M. G.
RidgeSketch: A Fast sketching based solver for large scale ridge regression, SIAM Journal on Matrix Analysis and Applications,
Vol. 43, Iss. 3, 2022.
Preprint
Journal
Code
R. M. G., Mark Schmidt, Francis Bach, Peter Richtárik
Variance-Reduced Methods for Machine Learning, Proceedings of the IEEE, Vol. 108, No. 11, pp. 1968-1983, Nov. 2020.
Preprint
Journal
Rui Yuan, Alessandro Lazaric, R. M. G.
Sketched Newton-Raphson, SIAM Journal on Optimization, Vol. 32, Iss. 3, 2022.
Preprint
Journal
Ahmed Khaled, Othmane Sebbouh, Nicolas Loizou, R. M. G., Peter Richtárik.
Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization, 2020.
Preprint
R. M. G., Othmane Sebbouh, Nicolas Loizou
SGD for Structured Nonconvex Functions: Learning Rates, Minibatching and Interpolation, AISTATS 2021.
Preprint
Proceedings
Aaron Defazio, R. M. G.
Factorial Powers for Stochastic Optimization, Asian Conference on Machine Learning (ACML), 2021.
Preprint
Proceedings
Othmane Sebbouh, R. M. G., Aaron Defazio.
Almost sure convergence rates for Stochastic Gradient Descent and Stochastic Heavy Ball, COLT 2021.
Preprint
Proceedings
Dmitry Kovalev, R. M. G., Peter Richtárik, Alexander Rogozin.
Fast Linear Convergence of Randomized BFGS, 2020.
Preprint
R. M. G., Denali Molitor, Jacob Moorman, Deanna Needell.
Adaptive Sketch-and-Project Methods for Solving Linear Systems, SIAM Journal on Matrix Analysis and Applications, 2019.
Preprint
Journal
O. Sebbouh, N. Gazagnadou, S. Jelassi, F. Bach, R. M. G.
Towards closing the gap between the theory and practice of SVRG, NeurIPS 2019.
Preprint
Code
Poster
Proceedings
R. M. G., N. Loizou, X. Qian, A. Sailanbayev, E. Shulgin, P. Richtárik.
SGD: general analysis and improved rates, ICML 2019 (extended oral presentation).
Preprint
Proceedings
A. Bibi, A. Sailanbayev, B. Ghanem, R. M. G. and P. Richtárik.
Improving SAGA via a probabilistic interpolation with gradient descent, 2018.
Preprint
B. K. Abid and R. M. G.
Greedy stochastic algorithms for entropy-regularized optimal transport problems, AISTATS 2018.
Preprint
Proceedings
Poster
R. M. G. and P. Richtárik.
Randomized quasi-Newton updates are linearly convergent matrix inversion algorithms, SIAM Journal on Matrix Analysis and Applications, 2017.
Preprint
Code
Journal
Slides
R. M. G.
Sketch and Project: Randomized Iterative Methods for Linear Systems and Inverting Matrices, PhD Dissertation, School of Mathematics, The University of Edinburgh, 2016.
Preprint
Code
Slides
R. M. G. and M. P. Mello.
Computing the sparsity pattern of Hessians using automatic differentiation, ACM Transactions on Mathematical Software, 2014.
Preprint
Journal
Code
R. M. G. and M. P. Mello.
A new framework for Hessian automatic differentiation, Optimization Methods and Software, 2012.
Preprint
Journal
Slides
Code