Florea, Mihai [UCL]
The Inexact Gradient Method with Memory (IGMM) can considerably outperform the Gradient Method by employing a piecewise linear lower model on the smooth part of the objective. However, the subproblem based on this model cannot be solved exactly, and IGMM relies on an inaccuracy term δ. The need for a bound on this inaccuracy narrows the range of problems to which IGMM can be applied. In addition, δ carries over to the worst-case convergence rate. In this work, we show how a simple modification of IGMM eliminates the reliance on δ for convergence. The resulting Exact Gradient Method with Memory (EGMM) is as broadly applicable as the Bregman Distance Gradient Method (NoLips) and has a worst-case rate of O(1/k), recently shown to be optimal for its class. Moreover, the elimination of δ allows us to accelerate EGMM without error accumulation, yielding an Accelerated Gradient Method with Memory (AGMM) possessing a worst-case rate of O(1/k^2) on the largest subclass of problems for which acceleration is known to be attainable. Preliminary computational experiments show that the flexibility of our model enables EGMM to surpass IGMM in practical performance. The convergence speed of AGMM also consistently exceeds that of the Fast Gradient Method (FGM), even with small bundles.
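To make the abstract's central idea concrete, the following is a minimal Python sketch of a Euclidean gradient method with memory: each step minimizes a piecewise linear lower model built from a bundle of past gradients, plus a proximal term, by running a few Frank-Wolfe iterations on the dual problem over the simplex. This is only loosely inspired by the methods described above, not the paper's EGMM; the names (`egmm_sketch`, `bundle_size`, `fw_steps`) and all implementation choices are our own, and the paper itself works in the more general Bregman-distance setting.

```python
import numpy as np

def egmm_sketch(f, grad, x0, L, n_iters=100, bundle_size=5, fw_steps=10):
    """Toy Euclidean gradient method with memory (an illustration, not the
    paper's EGMM). Each outer step minimizes
        max_i [ f(x_i) + <g_i, x - x_i> ] + (L/2) ||x - x_k||^2
    over the current bundle {(x_i, f(x_i), g_i)} by running a few
    Frank-Wolfe iterations on the concave dual over the simplex."""
    x = x0.astype(float).copy()
    xs, fs, gs = [x.copy()], [f(x)], [grad(x)]
    for _ in range(n_iters):
        X, F, G = np.array(xs), np.array(fs), np.array(gs)
        m = len(fs)
        # Dual: maximize phi(lam) = <c, lam> - (1/2L)||G^T lam||^2 over the
        # simplex, where c_i = f(x_i) + <g_i, x_k - x_i>; the primal step is
        # then x_{k+1} = x_k - (1/L) G^T lam.
        c = F + np.einsum('ij,ij->i', G, x - X)
        lam = np.zeros(m)
        lam[-1] = 1.0              # start from the plain gradient step
        for _ in range(fw_steps):  # Frank-Wolfe with exact line search
            grad_phi = c - G @ (G.T @ lam) / L
            d = -lam
            d[int(np.argmax(grad_phi))] += 1.0   # direction e_i* - lam
            slope = grad_phi @ d
            if slope <= 1e-14:     # dual (near-)stationary: stop early
                break
            u = G.T @ d
            denom = u @ u          # curvature of phi along d is -denom / L
            gamma = 1.0 if denom < 1e-14 else min(1.0, L * slope / denom)
            lam += gamma * d
        x = x - (G.T @ lam) / L    # primal step from the approximate dual point
        xs.append(x.copy()); fs.append(f(x)); gs.append(grad(x))
        if len(xs) > bundle_size:  # keep only the most recent bundle entries
            xs.pop(0); fs.pop(0); gs.pop(0)
    return x

# Usage on a toy ill-conditioned quadratic f(x) = 0.5 x^T A x, with L = 100.
A = np.diag(np.linspace(1.0, 100.0, 20))
x_min = egmm_sketch(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                    x0=np.ones(20), L=100.0)
print(np.linalg.norm(x_min))   # should be close to 0, the minimizer
```

Because the dual iterate starts at the vertex corresponding to the newest gradient, the sketch never does worse in the model than a plain gradient step, and each Frank-Wolfe iteration can only improve it; this mirrors, in a crude way, the abstract's point that convergence should not hinge on solving the bundle subproblem exactly.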
Bibliographic reference: Florea, Mihai. Exact gradient methods with memory. CORE Discussion Papers; 2019/26 (2019), 23 pages.
Permanent URL: http://hdl.handle.net/2078.1/223944