Seminar Details
2025-04-09 (15:00): Stochastic second-order optimization: global bounds, subspaces, and momentum
At Euler building (room A.002)
Speaker:
Nikita Doikov
Abstract:
In this talk, we present stochastic second-order algorithms for solving general non-convex optimization problems. Using cubic regularization, we prove global convergence rates for our methods. We discuss two techniques that improve the behaviour of our algorithms in the large-scale setting: stochastic subspaces, to handle high-dimensional problems, and stochastic methods with momentum. The latter technique provably reduces the variance of the stochastic estimates and allows the method to converge for any noise level. This is in stark contrast to existing stochastic second-order methods for non-convex problems, which typically require large batches.
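To give a flavour of the cubic regularization idea mentioned in the abstract, the sketch below runs a deterministic cubic-regularized Newton method on a simple 1-D non-convex function. The function, the constant M, and all other details are illustrative choices, not taken from the talk; the stochastic, subspace, and momentum variants discussed in the talk build on this basic scheme.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative non-convex objective (not from the talk): two minima at x = +/-1.
def f(x):   return 0.25 * x**4 - 0.5 * x**2
def df(x):  return x**3 - x
def d2f(x): return 3 * x**2 - 1

M = 2.0   # regularization constant (an upper bound on the Hessian's Lipschitz constant)
x = 2.0   # starting point

for _ in range(30):
    g, h = df(x), d2f(x)
    # Cubic model around x:  m(s) = g*s + 0.5*h*s^2 + (M/6)*|s|^3.
    # In 1-D we can minimize it directly; in higher dimensions this
    # subproblem is solved with specialized routines.
    step = minimize_scalar(lambda s: g * s + 0.5 * h * s**2 + (M / 6) * abs(s)**3,
                           bounds=(-2.0, 2.0), method="bounded").x
    x += step

# x converges to a stationary point of f (here the local minimum near x = 1),
# even though the Hessian is indefinite at the starting region.
```

Unlike plain Newton's method, the cubic term keeps each step well-defined and bounded even where the Hessian is indefinite, which is what enables global convergence guarantees for non-convex problems.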
