Preprints

  • Stochastic Modified Flows for Riemannian Stochastic Gradient Descent,
    joint work with Benjamin Gess and Nimit Rana
    (Preprint)

  • On the existence of optimal shallow feedforward networks with ReLU activation,
    joint work with Steffen Dereich
    accepted by J. Mach. Learn.
    (Preprint)

  • On the existence of minimizers in shallow residual ReLU neural network optimization landscapes,
    joint work with Steffen Dereich and Arnulf Jentzen
    (Preprint)

  • Stochastic Modified Flows, Mean-Field Limits and Dynamics of Stochastic Gradient Descent,
    joint work with Benjamin Gess and Vitalii Konarovskyi
    accepted by J. Mach. Learn. Res.
    (Preprint)

  • Convergence rates for momentum stochastic gradient descent with noise of machine learning type,
    joint work with Benjamin Gess
    (Preprint)

  • Convergence of stochastic gradient descent schemes for Łojasiewicz-landscapes,
    joint work with Steffen Dereich
    (Preprint)

Publications

  • Central limit theorems for stochastic gradient descent with averaging for stable manifolds,
    joint work with Steffen Dereich
    Electron. J. Probab. 28, 1-48 (2023) (arXiv, PDF)

  • Cooling down stochastic differential equations: almost sure convergence,
    joint work with Steffen Dereich
    Stochastic Process. Appl. 152, 289-311 (2022) (arXiv, PDF)

  • On minimal representations of shallow ReLU networks,
    joint work with Steffen Dereich
    Neural Networks 148, 121-128 (2022) (arXiv, PDF)