Robert M. Gower
Research Scientist, Flatiron Institute
I am a Research Scientist in the Center for Computational Mathematics at the Flatiron Institute in New York City.
Optimization / Machine Learning
My current work sits between optimization theory and machine-learning practice, with a particular taste for simple algorithms that exploit problem structure. In chronological order, I have been obsessed with Adam, Polyak stepsizes, Muon, and non-Euclidean gradient descent.
I have also worked on variational inference, randomized numerical linear algebra, automatic differentiation, and quasi-Newton methods.
Recent News
- Next talk: SIAM Conference on Optimization (OP26), June 2-5, 2026.
- The Polar Express received the ICLR 2026 Outstanding Paper Honorable Mention and was presented as an oral.
- In Search of Adam's Secret Sauce was an oral presentation at NeurIPS 2025.
People & Mentoring
- Samy Jelassi: Intern, 2017 → PhD, Princeton
- Nidham Gazagnadou: PhD student, 2018–2021 → Research Scientist, Sony AI
- Rui Yuan: PhD student, 2019–2023 → AI Research Scientist, Stellantis
- Si Yi Meng (Cathy): Intern, 2022 and 2023
- Fabian Schaipp: Guest researcher, 2022 & 2023
- Slavomír Hanzely: Intern, 2022 → Researcher, CISPA
- Aaron Mishkin: Intern, 2023 → Postdoc, EPFL
- David Persson: Flatiron Fellow, 2024–
- Michael Crawshaw: Intern, 2025 → Flatiron Fellow, 2026
- Tetiana Parshakova: Flatiron Fellow, 2025–
Papers
Recorded Talks
Unpublished Notes
- A Very Simple Introduction to Diffusion Models and the Standard Loss Function (short note, 2023)
- PhD Thesis (University of Edinburgh, 2016)
- Halley–Chebyshev Challenge (technical note, 2014)
- Conjugate Gradients: The Short and Painful Explanation with Oblique Projections (expository note, 2014)
- Hessian Matrices via Automatic Differentiation (MSc thesis / technical report, 2011)
- Efficient Calculation of Derivatives Through Graph Coloring (undergraduate project / technical report, in Portuguese, 2009)