Communications and Signal Processing Seminar

Learning via early stopping and untrained neural nets

Mahdi Soltanolkotabi
Assistant Professor
Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California

Abstract:  Modern neural networks are typically trained in an over-parameterized regime where the number of model parameters far exceeds the size of the training data. Such neural networks in principle have the capacity to (over)fit any set of labels, including significantly corrupted ones. Despite this (over)fitting capacity, over-parameterized networks have an intriguing robustness capability: they are surprisingly robust to label noise when trained with first-order methods and early stopping. Even more surprisingly, one can remove noise and corruption from a natural image without using any training data whatsoever, by simply fitting (via gradient descent) a randomly initialized, over-parameterized convolutional generator to a single corrupted image. In this talk I will first present theoretical results aimed at explaining the robustness of neural networks trained via early-stopped gradient descent. I will then present results toward demystifying untrained networks for image reconstruction/restoration tasks such as denoising and those arising in inverse problems such as compressive sensing.

Bio:  Mahdi Soltanolkotabi is an assistant professor in the Ming Hsieh Department of Electrical and Computer Engineering and Computer Science at the University of Southern California, where he holds an Andrew and Erna Viterbi Early Career Chair. Prior to joining USC, he completed his PhD in electrical engineering at Stanford in 2014. He was a postdoctoral researcher in the EECS department at UC Berkeley during the 2014-2015 academic year. Mahdi is the recipient of the Information Theory Society Best Paper Award, a Packard Fellowship in Science and Engineering, a Sloan Research Fellowship, an NSF CAREER award, an Air Force Office of Scientific Research Young Investigator award (AFOSR-YIP), and a Google faculty research award. His research focuses on developing the mathematical foundations of modern data science, with a particular focus on the design and mathematical understanding of computationally efficient algorithms for optimization, machine learning, signal processing, high-dimensional statistics, computational imaging, and artificial intelligence.

Join Zoom Meeting

Meeting ID: 975 9857 1292

Passcode: XXXXXX (Will be sent via email to attendees)


See the full seminar by Prof. Soltanolkotabi