Stochastic Modified Flows for Riemannian Stochastic Gradient Descent,
joint work with Benjamin Gess and Nimit Rana
(Preprint)
On the existence of minimizers in shallow residual ReLU neural network optimization landscapes,
joint work with Steffen Dereich and Arnulf Jentzen
(Preprint)
Convergence rates for momentum stochastic gradient descent with noise of machine learning type,
joint work with Benjamin Gess
(Preprint)
Convergence of stochastic gradient descent schemes for Łojasiewicz-landscapes,
joint work with Steffen Dereich
(Preprint)
On the existence of optimal shallow feedforward networks with ReLU activation,
joint work with Steffen Dereich
J. Mach. Learn. 3(1): 1-22 (2024) (arXiv, PDF)
Stochastic Modified Flows, Mean-Field Limits and Dynamics of Stochastic Gradient Descent,
joint work with Benjamin Gess and Vitalii Konarovskyi
J. Mach. Learn. Res. 25(30): 1-27 (2024) (arXiv, PDF)
Central limit theorems for stochastic gradient descent with averaging for stable manifolds,
joint work with Steffen Dereich
Electron. J. Probab. 28, 1-48 (2023) (arXiv, PDF)
Cooling down stochastic differential equations: almost sure convergence,
joint work with Steffen Dereich
Stochastic Process. Appl. 152, 289-311 (2022) (arXiv, PDF)
On minimal representations of shallow ReLU networks,
joint work with Steffen Dereich
Neural Networks 148, 121-128 (2022) (arXiv, PDF)