
Seminar Details


2024-01-30 (14h): Gradient Methods with Memory featuring quadratic functional growth estimation for minimizing composite functions

At Euler building (room A.002)

Organized by Mathematical Engineering

Speaker: Mihai Florea (INMA, UCLouvain)
Abstract: The recently introduced Gradient Methods with Memory use a subset of the past oracle information to build a model of the objective function whose accuracy enables them to surpass the traditional Gradient Methods in practical performance. The model introduces a substantial overhead, except when dealing with smooth unconstrained problems. In this work, we introduce several Gradient Methods with Memory that solve composite problems efficiently, including unconstrained problems with non-smooth objectives. The inexactness of the auxiliary problem does not degrade the convergence guarantees; on the contrary, we dynamically increase the guarantees so as to provably surpass those of the memory-less counterparts. These strengths are preserved under acceleration, and the containment of inexactness further prevents error accumulation. Our methods can estimate key geometry parameters to attain state-of-the-art worst-case rates on many important subclasses of composite problems, in which the smooth part of the objective satisfies a strong convexity condition or a relaxation thereof. In particular, we formulate a restart strategy applicable to optimization methods with sublinear convergence guarantees of any order. Preliminary computational results on a synthetic benchmark of signal processing and machine learning problems show that the combined use of memory and dynamic estimation of quadratic functional growth attains 9 digits of objective accuracy in fewer than 200 iterations, even when strong convexity is absent. This research was mostly performed in the Department of Electronics and Computers, Transilvania University of Brasov, Romania.
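The restart idea mentioned in the abstract can be illustrated with a generic sketch. This is not the speaker's algorithm: it is a standard fixed-period momentum restart for Nesterov's accelerated gradient method on a toy least-squares problem, with all names and parameter choices hypothetical. It shows the general principle that restarting recovers fast practical convergence when the objective exhibits quadratic functional growth.

```python
import numpy as np

def accel_gd_restart(grad, L, x0, iters=200, restart_every=50):
    """Nesterov's accelerated gradient with a fixed-period momentum
    restart (illustrative only; parameter choices are hypothetical)."""
    x = y = x0.copy()
    t = 1.0
    for k in range(iters):
        x_new = y - grad(y) / L                       # gradient step from extrapolated point
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
        if (k + 1) % restart_every == 0:              # restart: drop accumulated momentum
            y, t = x.copy(), 1.0
    return x

# Toy smooth convex objective f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
L = np.linalg.norm(A, 2) ** 2                         # Lipschitz constant of the gradient
x_star = np.linalg.lstsq(A, b, rcond=None)[0]         # reference solution
x_hat = accel_gd_restart(lambda x: A.T @ (A @ x - b), L, np.zeros(10))
print(np.linalg.norm(x_hat - x_star))
```

More refined schemes trigger the restart adaptively, e.g. from an estimated growth parameter rather than a fixed period; the abstract's methods estimate such geometry parameters dynamically.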