Talk by Andre Uschmajew: "Gradient sampling for nonsmooth optimization on manifolds and varieties"
May 14 (Mon.)
15:00 - 16:00
National Institute of Informatics
Room 1208, 12th floor
Andre Uschmajew (Max Planck Institute, Leipzig)
Gradient sampling for nonsmooth optimization on manifolds and varieties
Gradient sampling (GS) is a conceptually simple method for the minimization of locally Lipschitz functions. The idea is to approximate the subdifferential at a given point by the convex hull of randomly sampled nearby gradients. We consider generalizations of this approach to optimization problems posed on Riemannian manifolds or on algebraic subvarieties admitting an a-regular stratification. The latter scenario is motivated by optimization problems on low-rank matrices or tensors. The main theoretical task is to prove that the minimum-norm element in the convex hull of vectors, obtained by transporting gradients from nearby points to the current tangent space, is a descent direction (w.r.t. a retraction), provided that the number of sampled gradients is larger than the dimension of the manifold/variety. For varieties it is further necessary to discuss what should happen at the singular points. Under reasonable assumptions, we can prove that with probability one the limit points of our proposed GS algorithms are Clarke stationary points relative to their strata. When the dimension of the manifold is too large, the full GS approach can be computationally expensive, but our numerical experiments, which concern the sparse vector problem on a sphere and robust low-rank matrix approximation, illustrate that a small number of gradient samples can be sufficient in practice. Joint work with Seyedehsomayeh Hosseini.
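The basic GS step described in the abstract can be sketched numerically. The following is a minimal illustration, not the speaker's implementation: as an assumed stand-in for the sparse vector problem mentioned above, it minimizes the l1 norm on the unit sphere, uses orthogonal projection onto the current tangent space as a crude gradient transport, solves the min-norm-element subproblem with a simple Frank-Wolfe iteration, and uses renormalization as the retraction. Sample radius, step rule, and iteration counts are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Nonsmooth objective: l1 norm restricted to the unit sphere
    # (illustrative surrogate for the sparse vector problem).
    return np.abs(x).sum()

def transported_grad(x, y):
    # Euclidean subgradient of the l1 norm at a nearby point y,
    # projected onto the tangent space of the sphere at x
    # (projection used here as a simple stand-in for transport).
    g = np.sign(y)
    return g - (g @ x) * x

def min_norm_element(G, iters=500):
    # Frank-Wolfe iteration for the minimum-norm point in the
    # convex hull of the columns of G: min_{lam in simplex} ||G lam||^2.
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = G.T @ (G @ lam)        # gradient of 0.5 * ||G lam||^2
        i = int(np.argmin(grad))      # best simplex vertex
        step = 2.0 / (k + 2.0)
        lam *= 1.0 - step
        lam[i] += step
    return G @ lam

def retract(v):
    # Retraction onto the sphere: renormalize.
    return v / np.linalg.norm(v)

def gs_step(x, eps=1e-2, m=None):
    n = x.size
    m = m or 2 * n                    # more samples than dim of the manifold (n - 1)
    # Sample nearby points on the sphere and transport their gradients to T_x.
    Y = x[:, None] + eps * rng.standard_normal((n, m))
    Y /= np.linalg.norm(Y, axis=0)
    G = np.column_stack([transported_grad(x, Y[:, j]) for j in range(m)])
    g = min_norm_element(G)
    ng = np.linalg.norm(g)
    if ng < 1e-8:                     # approximately Clarke stationary
        return x
    d = -g / ng                       # candidate descent direction in T_x
    t = 1.0
    while t > 1e-12:                  # Armijo-style backtracking along the retraction
        x_try = retract(x + t * d)
        if f(x_try) <= f(x) - 1e-4 * t * ng:
            return x_try
        t *= 0.5
    return x                          # no decrease found; keep the current point

x = retract(rng.standard_normal(8))
f0 = f(x)
for _ in range(200):
    x = gs_step(x)
print(f0, f(x))
```

With the backtracking line search the objective value is monotonically non-increasing, and the iterates stay on the sphere by construction; for the l1 norm the minimal value on the sphere is 1, attained at the coordinate vectors.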
nakatsukasa [at] nii.ac.jp