Stochastic Modified Flows for Riemannian Stochastic Gradient Descent,
joint work with Benjamin Gess and Nimit Rana
Preprint, accepted by SIAM J. Control Optim.
On the existence of minimizers in shallow residual ReLU neural network optimization landscapes,
joint work with Steffen Dereich and Arnulf Jentzen
Preprint, accepted by SIAM J. Numer. Anal.
Convergence rates for momentum stochastic gradient descent with noise of machine learning type,
joint work with Benjamin Gess
(Preprint)
Convergence of Stochastic Gradient Descent Schemes for Łojasiewicz-Landscapes,
joint work with Steffen Dereich
J. Mach. Learn. (2024), to appear (arXiv, PDF)
On the Existence of Optimal Shallow Feedforward Networks with ReLU Activation,
joint work with Steffen Dereich
J. Mach. Learn. 3(1): 1-22 (2024) (arXiv, PDF)
Stochastic Modified Flows, Mean-Field Limits and Dynamics of Stochastic Gradient Descent,
joint work with Benjamin Gess and Vitalii Konarovskyi
J. Mach. Learn. Res. 25(30): 1-27 (2024) (arXiv, PDF)
Central limit theorems for stochastic gradient descent with averaging for stable manifolds,
joint work with Steffen Dereich
Electron. J. Probab. 28, 1-48 (2023) (arXiv, PDF)
Cooling down stochastic differential equations: almost sure convergence,
joint work with Steffen Dereich
Stochastic Process. Appl. 152, 289-311 (2022) (arXiv, PDF)
On minimal representations of shallow ReLU networks,
joint work with Steffen Dereich
Neural Networks 148, 121-128 (2022) (arXiv, PDF)