CORE Seminar: James Renegar
Details
The study of first-order methods has largely dominated research in continuous optimization for the last decade, yet the range of problems for which efficient and easily applicable first-order methods have been developed remains surprisingly limited, even though much has been achieved in high-profile areas such as compressed sensing.
We present a simple transformation of any linear or semidefinite (or hyperbolic) program into an equivalent convex optimization problem whose only constraints are linear equations. The objective function is defined on the whole space, making virtually all subgradient methods immediately applicable. We observe, moreover, that the objective function is naturally smoothed, thereby allowing most first-order methods to be applied.
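To convey the flavor of such a reformulation, here is a small numerical sketch for the simplest case, a linear program over the nonnegative orthant. The toy instance, the strictly feasible point `e`, the target level `z`, and the step sizes are all illustrative choices made for this example, not details taken from the talk: given a strictly feasible `e`, one maximizes the concave, everywhere-defined function min_j x_j/e_j subject only to linear equations, then maps the result back radially.

```python
import numpy as np

# Toy LP:  min c^T x  s.t.  A x = b,  x >= 0
# (instance, e, z, and step sizes are illustrative assumptions)
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0, 3.0])
e = np.array([1/3, 1/3, 1/3])    # strictly feasible: A e = b, e > 0, c^T e = 2

# Reformulation (sketched for the nonnegative orthant): fix a level
# z < c^T e and solve
#     maximize  f(x) = min_j x_j / e_j
#     subject to  A x = b,  c^T x = z,
# a concave problem whose only constraints are linear equations.
z = 1.5
M = np.vstack([A, c[None, :]])   # stacked equality constraints
d = np.array([b[0], z])

MMT_inv = np.linalg.inv(M @ M.T)
def project(x):
    """Euclidean projection onto the affine set {x : M x = d}."""
    return x - M.T @ (MMT_inv @ (M @ x - d))

def f(x):                        # concave, piecewise-linear objective
    return np.min(x / e)

def supergradient(x):            # a supergradient of f at x
    j = np.argmin(x / e)
    g = np.zeros_like(x)
    g[j] = 1.0 / e[j]
    return g

# Projected subgradient ascent with diminishing steps, keeping the best point.
x = project(e)
best_x, best_f = x, f(x)
for k in range(3000):
    x = project(x + 0.05 / np.sqrt(k + 1) * supergradient(x))
    if f(x) > best_f:
        best_x, best_f = x, f(x)

# Radial map back to the original LP's feasible region: the coordinate
# attaining the min lands exactly on the boundary x_j = 0.
lam = f(best_x)
x_star = e + (best_x - e) / (1.0 - lam)
print(best_f, x_star, c @ x_star)   # best_f ~ 0.5, x_star ~ (1, 0, 0)
```

On this instance the transformed optimum is f = 1/2 at (2/3, 1/6, 1/6), and the radial map sends it to (1, 0, 0), the optimal solution of the original LP with value 1; no constraint other than the two linear equations is ever enforced during the iteration.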
We develop complexity bounds in the unsmoothed case for a particular subgradient method, and in the smoothed case for Nesterov's original optimal first-order method for smooth functions. In both settings the bounds on the number of iterations are of the desired order.
Perhaps most surprising is that the transformation is simple, as is the basic theory, and yet the approach has been overlooked until now, a blind spot.
Additional Information
James Renegar, Cornell