Faculty Candidate Seminar
Fast Randomized Algorithms for Convex Optimization
With the advent of massive data sets, statistical learning and information processing techniques are expected to enable unprecedented possibilities for better decision making. However, existing algorithms for mathematical optimization, the core component of these techniques, often fail to scale to the full extent of the available data. This talk introduces our recent work on random projection methods for general convex optimization problems to address this challenge. First, we establish the theoretical relation between the complexity and the optimality of solutions. Then, we provide a general information-theoretic lower bound on any method based on random projection, which surprisingly shows that the most widely used form of random projection is, in fact, statistically sub-optimal. Finally, we present a novel method that iteratively refines the solutions to achieve statistical optimality, and we generalize it to optimizing arbitrary convex functions of a large data matrix. The proposed method, called the Newton Sketch, is a faster randomized version of the well-known Newton's method, with computational complexity linear in the input data size. We show that the Newton Sketch enables solving large-scale optimization and statistical inference problems orders of magnitude faster than existing methods. Moreover, owing to the special structure of random projection, computation can be sped up even further with dedicated hardware implementations such as graphics processing units (GPUs).
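To give a flavor of the idea, here is a minimal, hypothetical sketch (not the speaker's implementation) of a sketched Newton iteration for a least-squares problem: the gradient is computed exactly, but the Hessian A^T A is approximated by (SA)^T (SA) using a random Gaussian sketch S. All dimensions, the sketch size, and the Gaussian choice of sketch are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative least-squares instance: f(x) = 0.5 * ||A x - b||^2
n, d, m = 2000, 50, 500           # m = sketch size, with d < m << n (assumed)
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

x = np.zeros(d)
for _ in range(10):
    grad = A.T @ (A @ x - b)                      # exact gradient
    S = rng.standard_normal((m, n)) / np.sqrt(m)  # fresh Gaussian sketch
    SA = S @ A                                    # sketched data matrix
    H_sketch = SA.T @ SA                          # approximate Hessian (SA)^T (SA)
    x = x - np.linalg.solve(H_sketch, grad)       # sketched Newton step
```

Each iteration solves only an m-by-d sketched system instead of forming the full n-by-d Hessian product, while the exact gradient drives the iterates toward the true least-squares solution; structured sketches (e.g., randomized Hadamard transforms) would make the S @ A product itself fast.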
Mert Pilanci is a PhD candidate in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley, under the joint supervision of Prof. Martin Wainwright and Prof. Laurent El Ghaoui. He received the B.Sc. and M.S. degrees, both in electrical and electronics engineering, from Bilkent University, Ankara, Turkey, in 2008 and 2010, respectively. He is a recipient of the Microsoft Research PhD Fellowship. His research interests include convex optimization, machine learning, statistics, information theory, and signal processing.