Global rates of convergence for nonconvex optimization on manifolds

Abstract

We consider the minimization of a cost function f on a manifold M using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance ε. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of f to the tangent spaces of M, both of these algorithms produce points with Riemannian gradient smaller than ε in O(1/ε²) iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than −ε in O(1/ε³) iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy ε (up to constants) and hence are sharp in that sense. These are the first general results for global rates of convergence to approximate first- and second-order KKT points on manifolds. They apply in particular to optimization constrained to compact submanifolds of ℝⁿ, under simpler assumptions.
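
For concreteness, the following is a minimal sketch (not taken from the paper) of Riemannian gradient descent on the unit sphere, using the stopping test on the Riemannian gradient norm mentioned in the abstract. The cost function f(x) = xᵀAx, the metric-projection retraction, and the step size are illustrative assumptions chosen for this example only.

    import numpy as np

    def riemannian_gradient_descent(A, x0, step, eps=1e-6, max_iter=10000):
        # Minimize f(x) = x' A x over the unit sphere {x : ||x|| = 1}.
        x = x0 / np.linalg.norm(x0)
        for k in range(max_iter):
            egrad = 2.0 * A @ x                # Euclidean gradient of f at x
            rgrad = egrad - (x @ egrad) * x    # project onto the tangent space at x
            if np.linalg.norm(rgrad) <= eps:   # approximate first-order condition
                return x, k
            x = x - step * rgrad               # gradient step along the tangent direction
            x = x / np.linalg.norm(x)          # retract back onto the sphere
        return x, max_iter

    rng = np.random.default_rng(0)
    B = rng.standard_normal((50, 50))
    A = (B + B.T) / 2                          # symmetric test matrix
    x_star, iters = riemannian_gradient_descent(
        A, rng.standard_normal(50),
        step=1.0 / (4.0 * np.linalg.norm(A, 2)))  # step of order 1/L for this cost

In this illustrative setting the iterate approaches an eigenvector associated with the smallest eigenvalue of A; the point returned satisfies the approximate first-order condition ‖grad f(x)‖ ≤ ε (or the iteration budget is exhausted).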

This paper was published in DIAL UCLouvain.
