
2025-08-20 (14h): Efficient Distance-Adaptive Subgradient Methods

At the Euler building (room A.002)

Speaker: Anton Rodomanov (CISPA, Germany)
Abstract: Subgradient methods based on the idea of gradient normalization are an appealing class of algorithms for convex optimization because they automatically adapt to the local growth rate of the objective and require no problem-specific parameters, except for a reasonably accurate estimate of the initial distance to the solution. Overestimating or underestimating this distance can, however, substantially slow convergence. We address this limitation by incorporating into normalization-based methods a recently proposed technique that dynamically estimates the distance via the displacement of the iterates from the starting point. This removes the need for any problem-specific information, while preserving the adaptability of the base methods and ensuring rigorous convergence guarantees. Illustrative examples include nonsmooth Lipschitz functions, Lipschitz- or Hölder-smooth functions, functions with high-order Lipschitz derivatives, quasi-self-concordant functions, and $(L_0, L_1)$-smooth functions. We further extend the approach to problems with functional constraints using a simple switching strategy.
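
To make the idea concrete, here is a minimal Python sketch of a normalized subgradient method whose step size uses a displacement-based distance estimate. The specific rule r_k = max_{i<=k} ||x_i - x_0|| (a DoG-style estimate) and the step size r_k / sqrt(k+1), along with all function names, are illustrative assumptions and not necessarily the exact algorithm presented in the talk.

    import numpy as np

    def normalized_subgradient_dog(subgrad, x0, num_steps, r_eps=1e-3):
        # Normalized subgradient method whose step size relies on a running,
        # displacement-based estimate of the initial distance to the solution.
        # Assumed (illustrative) scheme: r_k = max_{i<=k} ||x_i - x0|| with
        # step size r_k / sqrt(k+1); the speaker's method may differ.
        x = x0.copy()
        r = r_eps  # crude initial distance estimate; only a small seed is needed
        for k in range(num_steps):
            g = subgrad(x)
            g_norm = np.linalg.norm(g)
            if g_norm == 0.0:  # zero subgradient: x is already a minimizer
                break
            r = max(r, np.linalg.norm(x - x0))   # update estimate from displacement
            x = x - (r / np.sqrt(k + 1)) * g / g_norm  # normalized step
        return x

    # Usage on a nonsmooth Lipschitz objective, f(x) = ||x - 1||_1:
    f_subgrad = lambda x: np.sign(x - 1.0)  # a valid subgradient of f
    x_final = normalized_subgradient_dog(f_subgrad, np.zeros(5), num_steps=5000)

Because the estimate r can only grow as iterates move away from x0, a gross underestimate of the initial distance is corrected automatically, which is the parameter-free behavior the abstract describes.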