Virtual Seminar: The Role of Regularization in Overparameterized Neural Networks

Friday, October 16, 2020

Overparameterized neural networks have proved remarkably successful at complex tasks such as image classification and deep reinforcement learning. In this talk, we will consider the role of explicit regularization in training overparameterized neural networks. Specifically, we consider ReLU networks and show that the landscape of commonly used regularized loss functions has the property that every local minimum has good memorization and generalization performance. Joint work with Shiyu Liang and Ruoyu Sun.
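To make the setting concrete, here is a minimal sketch of the kind of training problem the talk considers: a one-hidden-layer ReLU network fit by gradient descent on a squared loss with an explicit L2 (weight-decay) regularizer. This is an illustrative toy, not the construction from the talk; the data, layer widths, learning rate, and regularization strength `lam` are arbitrary choices.

```python
import numpy as np

# Illustrative sketch only: sizes and hyperparameters are arbitrary.
rng = np.random.default_rng(0)
n, d, m = 32, 5, 64                      # samples, input dim, hidden width (m large => overparameterized)
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

W1 = 0.1 * rng.standard_normal((d, m))   # first-layer weights
w2 = 0.1 * rng.standard_normal(m)        # second-layer weights
lam, lr = 1e-3, 1e-2                     # L2 penalty strength, step size

def forward(W1, w2, X):
    H = np.maximum(X @ W1, 0.0)          # ReLU activations
    return H, H @ w2

def regularized_loss(W1, w2):
    _, pred = forward(W1, w2, X)
    fit = 0.5 * np.mean((pred - y) ** 2)                       # data-fitting term
    reg = 0.5 * lam * (np.sum(W1 ** 2) + np.sum(w2 ** 2))      # explicit L2 regularizer
    return fit + reg

losses = []
for _ in range(200):
    H, pred = forward(W1, w2, X)
    err = (pred - y) / n                 # gradient of the mean-squared fit term w.r.t. pred
    gw2 = H.T @ err + lam * w2
    gH = np.outer(err, w2) * (H > 0)     # backprop through the ReLU
    gW1 = X.T @ gH + lam * W1
    W1 -= lr * gW1
    w2 -= lr * gw2
    losses.append(regularized_loss(W1, w2))
```

Gradient descent drives the regularized objective down; the talk's result concerns the local minima of exactly this kind of objective, showing (under the paper's assumptions) that they simultaneously fit the data well and generalize well.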

Time: 11:00 AM – 12:00 PM Central (CDT; UTC -5) 

Access: The seminar was delivered live via Zoom on Friday, October 16. A video recording of the talk is available on our YouTube channel.


Speaker: Rayadurgam Srikant
University of Illinois at Urbana-Champaign

R. Srikant is one of two Co-Directors of the Digital Transformation Institute, jointly headquartered at UIUC and Berkeley, a consortium of universities (Stanford, MIT, CMU, UChicago, Princeton, Berkeley, and UIUC) and industry partners (including Microsoft) that promotes research on AI, ML, IoT, and cloud computing for the betterment of society. He is also the Fredric G. and Elizabeth H. Nearing Endowed Professor of ECE and the Coordinated Science Lab at UIUC. His research interests are in machine learning, AI, communication networks, and applied probability. He is the winner of the 2019 IEEE Koji Kobayashi Computers and Communications Award, the 2017 Applied Probability Society Best Publication Award, and the 2015 IEEE INFOCOM Achievement Award. He has also won the Distinguished Alumnus Award from the Indian Institute of Technology Madras.