
A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives

Abstract

Minor modifications in notation and acknowledgments. These methods were originally presented in arXiv:2006.06041v2. Code available at https://github.com/mathbarre/StronglyConvexForwardBackward.

In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov's accelerated proximal gradient method) possibly relying on approximate proximal operators and allowing one to exploit strong convexity of the objective function. The method supports both relative and absolute errors, and its behavior is illustrated on a set of standard numerical experiments. Using the same developments, we further provide a version of the accelerated proximal hybrid extragradient method of Monteiro and Svaiter (2013), possibly exploiting strong convexity of the objective function.
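For readers unfamiliar with the method family the abstract refers to, the following is a minimal sketch of an *exact* accelerated forward-backward method with the textbook constant momentum for a mu-strongly convex, L-smooth smooth part; it does not reproduce the paper's inexact (absolute/relative error) scheme, and the function names and the lasso toy problem are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_fb(grad_f, prox_g, x0, L, mu, n_iter=500):
    """Accelerated forward-backward for min_x f(x) + g(x), with f
    L-smooth and mu-strongly convex.  Uses the classical constant
    momentum beta = (1 - sqrt(mu/L)) / (1 + sqrt(mu/L)); this is a
    standard exact variant, not the paper's error-tolerant method."""
    x = x0.copy()
    y = x0.copy()
    q = np.sqrt(mu / L)
    beta = (1.0 - q) / (1.0 + q)
    for _ in range(n_iter):
        # forward (gradient) step on f, backward (proximal) step on g
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
        # extrapolation (momentum) step
        y = x_new + beta * (x_new - x)
        x = x_new
    return x

# illustrative toy problem: lasso, min 0.5*||Ax - b||^2 + lam*||x||_1,
# which is strongly convex here since A has full column rank
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(A.T @ A, 2)           # Lipschitz constant of grad f
mu = np.linalg.eigvalsh(A.T @ A).min()   # strong-convexity constant
x_star = accelerated_fb(lambda x: A.T @ (A @ x - b),
                        lambda v, t: soft_threshold(v, lam * t),
                        np.zeros(10), L, mu)
```

The inexact versions studied in the note replace the exact `prox_g` call with an approximation satisfying an absolute or relative error criterion; the momentum sequence is then chosen to preserve the accelerated rate.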

Last time updated on 11/05/2022

This paper was published in Hal-Diderot.
