Computing and Mathematical Sciences Colloquium
Subgradient Methods for General Convex Optimization
Subgradient methods are the most basic algorithms in convex optimization, applicable quite generally and having simple convergence proofs, modulo assumptions such as orthogonal projections onto convex sets being readily computable. The fundamentals of subgradient methods have changed little in decades, perhaps in part due to a supposition among researchers that not much else can be said for algorithms relying on hardly any problem structure beyond convexity and Lipschitz continuity. We explain that, to the contrary, further advances are possible. Under the assumption that a strictly feasible point is known, we present an elementary algorithmic device which removes the need for Lipschitz continuity and replaces orthogonal projections with line searches.
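For readers unfamiliar with the classical setup the abstract departs from, the following is a minimal sketch of the standard projected subgradient method (not the speaker's new device): iterate x_{k+1} = P(x_k - t_k g_k), where g_k is any subgradient at x_k, t_k is a diminishing step size, and P is the orthogonal projection onto the feasible set. The example problem (an l1 objective over the unit ball) and all names are illustrative assumptions.

```python
import numpy as np

def projected_subgradient(f_subgrad, project, x0, steps=500):
    """Classical projected subgradient method with diminishing steps
    t_k = 1/sqrt(k+1). Subgradient steps need not decrease the objective
    monotonically, so the best iterate seen so far is tracked."""
    x = project(x0)
    best_x, best_f = x, None
    for k in range(steps):
        fx, g = f_subgrad(x)
        if best_f is None or fx < best_f:
            best_x, best_f = x, fx
        x = project(x - g / np.sqrt(k + 1))  # step, then project back
    return best_x, best_f

# Illustrative problem: minimize ||A x - b||_1 over the Euclidean unit ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((8, 3)), rng.standard_normal(8)

def f_subgrad(x):
    r = A @ x - b
    # A^T sign(r) is a valid subgradient of ||A x - b||_1 at x.
    return np.abs(r).sum(), A.T @ np.sign(r)

def project(x):
    # Orthogonal projection onto the unit ball: rescale if outside.
    n = np.linalg.norm(x)
    return x if n <= 1 else x / n

x_star, f_star = projected_subgradient(f_subgrad, project, np.zeros(3))
```

Note that the method relies on the projection P being readily computable, which is exactly the assumption the talk proposes to relax via line searches from a strictly feasible point.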
Contact: Carmen Nemer-Sirois at (626) 395-4561 email@example.com