Communications and Signal Processing Seminar

Optimized first-order convex minimization methods

Jeff Fessler
Professor, University of Michigan, Department of Electrical Engineering and Computer Science

Many problems in signal and image processing, machine learning, and estimation require minimizing convex cost functions. For convex cost functions with Lipschitz continuous gradients, Nesterov's fast gradient method decreases the cost function at a rate inversely proportional to the square of the number of iterations, a rate order that is optimal for first-order methods. This talk presents a new first-order convex optimization method with a worst-case cost-function bound twice as small as Nesterov's, yet with a remarkably simple implementation whose per-iteration cost is comparable to that of Nesterov's method. This is work with doctoral student Donghwan Kim.

Sponsored by

University of Michigan, Department of Electrical Engineering & Computer Science