Guanghui Lan, University of Florida

Title

Conditional Gradient Sliding for Convex Optimization

Abstract

We introduce a new conditional gradient type method for convex optimization that relies on a linear optimization oracle to minimize a sequence of linear functions over the feasible set.

Different from the classical conditional gradient method, this algorithm can skip the computation of gradients from time to time and, as a result, achieves the optimal rate of convergence in terms of not only the number of calls to the linear optimization oracle, but also the number of calls to the first-order oracle. We also develop stochastic variants of this algorithm for solving stochastic optimization problems in an optimal manner. To the best of our knowledge, this is the first time that an optimal projection-free (stochastic) first-order method of this type has been presented in the literature.

(This is joint work with Y. Zhou.)
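As a rough illustration of the idea described in the abstract, the sketch below shows an accelerated outer loop that calls the gradient oracle once per iteration, while the projection-type subproblem is solved only approximately by inner conditional gradient (Frank-Wolfe) steps that use nothing but a linear optimization oracle. This is a simplified sketch, not the authors' exact algorithm: the function names, step sizes, and accuracy parameters (`gamma`, `beta`, `eta`) are illustrative assumptions, not the parameter policy from the paper.

```python
# Hedged, simplified sketch of the conditional-gradient-sliding idea.
# The inner routine approximately minimizes <g, x> + (beta/2)||x - u||^2 over X
# using only a linear optimization oracle lmo(c) = argmin_{x in X} <c, x>.
import numpy as np

def cnd_grad_subproblem(g, u, beta, lmo, x0, tol, max_inner=100):
    """Approximately solve the quadratic subproblem by Frank-Wolfe steps."""
    x = x0
    for _ in range(max_inner):
        grad_phi = g + beta * (x - u)          # gradient of the quadratic model
        v = lmo(grad_phi)                      # linear optimization oracle call
        gap = grad_phi @ (x - v)               # Frank-Wolfe (Wolfe) gap
        if gap <= tol:
            break
        d = v - x
        # exact line search for the quadratic model, clipped to [0, 1]
        step = min(1.0, gap / (beta * (d @ d) + 1e-16))
        x = x + step * d
    return x

def cgs_sketch(grad_f, lmo, x0, L, n_outer=50):
    """Simplified conditional-gradient-sliding loop (illustrative parameters)."""
    x = y = x0
    for k in range(1, n_outer + 1):
        gamma = 2.0 / (k + 1)                  # momentum weight (illustrative)
        beta = 3.0 * L / (k + 1)               # prox parameter (illustrative)
        eta = L / (k * (k + 1))                # inner accuracy (illustrative)
        z = (1 - gamma) * y + gamma * x        # extrapolation point
        g = grad_f(z)                          # single gradient call per outer step
        x = cnd_grad_subproblem(g, x, beta, lmo, x, eta)  # no gradients of f here
        y = (1 - gamma) * y + gamma * x
    return y

if __name__ == "__main__":
    # Example: minimize ||A x - b||^2 over the probability simplex.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
    grad_f = lambda x: 2 * A.T @ (A @ x - b)
    L = 2 * np.linalg.norm(A.T @ A, 2)
    def simplex_lmo(c):                        # minimizer over the simplex is a vertex
        e = np.zeros_like(c); e[np.argmin(c)] = 1.0
        return e
    y = cgs_sketch(grad_f, simplex_lmo, np.full(10, 0.1), L)
    print("objective:", np.sum((A @ y - b) ** 2))
```

The key point the sketch tries to convey is that each outer iteration evaluates the gradient of the objective only once, while all inner iterations touch the feasible set exclusively through the linear optimization oracle, so no projections are ever computed.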
