Communications and Signal Processing Seminar
Learning is Pruning
This event is free and open to the public.
Abstract: The strong lottery ticket hypothesis (LTH) postulates that any neural network can be approximated by simply pruning a sufficiently large network of random weights. Recent work establishes that the strong LTH is true if the random network to be pruned is a polynomial factor wider than the target one. This polynomial over-parameterization is at odds with experimental research, which achieves good approximation by pruning networks that are only a small factor wider than the target one. In this talk, I will explain how we close this gap and obtain an exponential improvement in the over-parameterization requirement. I will sketch a proof that any target network can be approximated by pruning a random one that is only a logarithmic factor wider. This is possible by establishing a connection between pruning random ReLU networks and random instances of the SubsetSum problem. Our work points to a universal and striking phenomenon: neural network training is equivalent to pruning slightly over-parameterized networks of random weights. I will conclude by sharing hints of a general framework that indicates the existence of good pruned networks for a variety of activation functions and architectures, even in the case where both the initialization weights and activations are binary.
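To give a flavor of the SubsetSum connection mentioned in the abstract, here is a minimal, hypothetical Python sketch (a toy illustration, not code from the talk or paper): it approximates a single target weight by keeping a subset of a handful of random candidate weights, which is the core primitive behind the logarithmic over-parameterization argument. The function name, target value, and number of candidates are illustrative choices only.

import itertools
import numpy as np

def best_subset_sum(candidates, target):
    """Brute-force search: which subset of the random candidate weights
    sums closest to the target weight? (Toy illustration of the random
    SubsetSum primitive; exponential time, only for tiny examples.)"""
    best_err, best_subset = abs(target), ()  # empty subset as baseline
    for r in range(1, len(candidates) + 1):
        for subset in itertools.combinations(range(len(candidates)), r):
            err = abs(target - sum(candidates[i] for i in subset))
            if err < best_err:
                best_err, best_subset = err, subset
    return best_subset, best_err

rng = np.random.default_rng(0)
target = 0.37                        # a single target weight in [-1, 1]
candidates = rng.uniform(-1, 1, 16)  # a small pool of random weights
subset, err = best_subset_sum(candidates, target)
print(f"kept {len(subset)} of {len(candidates)} random weights, error = {err:.5f}")

Classical results on random SubsetSum suggest that roughly log(1/epsilon) random samples suffice to approximate any target in [-1, 1] to within epsilon with high probability, which is where a logarithmic width requirement can come from.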
Speaker Bio: Dimitris Papailiopoulos is an Assistant Professor of Electrical and Computer Engineering at the University of Wisconsin-Madison, a faculty fellow of the Grainger Institute for Engineering, and a faculty affiliate at the Wisconsin Institute for Discovery. His research interests span machine learning, information theory, and distributed systems, with a current focus on efficient large-scale training algorithms and coding-theoretic techniques for robust machine learning. Between 2014 and 2016, Dimitris was a postdoctoral researcher at UC Berkeley and a member of the AMPLab. He earned his Ph.D. in ECE from UT Austin in 2014, under the supervision of Alex Dimakis. He received his ECE Diploma in 2007 and his M.Sc. degree in 2009 from the Technical University of Crete, in Greece. Dimitris is a recipient of the NSF CAREER Award (2019), two Sony Faculty Innovation Awards (2019 and 2020), a joint IEEE ComSoc/ITSoc Best Paper Award (2020), an IEEE Signal Processing Society Young Author Best Paper Award (2015), the Vilas Associate Award (2021), the Emil Steiger Distinguished Teaching Award (2021), and the Benjamin Smith Reynolds Award for Excellence in Teaching (2019). In 2018, he co-founded MLSys, a new conference targeting research at the intersection of machine learning and systems. He was program co-chair for MLSys in 2018 and 2020, and in 2019 he co-chaired the 3rd Midwest Machine Learning Symposium.
Join Zoom Meeting https://umich.zoom.us/j/97598571292
Meeting ID: 975 9857 1292
Passcode: XXXXXX (Will be sent via email to attendees)
Zoom Passcode information is also available upon request to Shelly (Michele) Feldkamp ([email protected]).