Virtual Seminar: The Role of Regularization in Overparameterized Neural Networks
Overparameterized neural networks have proven remarkably successful on many complex tasks, such as image classification and deep reinforcement learning. In this talk, we consider the role of explicit regularization in training overparameterized neural networks. Specifically, we consider ReLU networks and show that the landscape of commonly used regularized loss functions has the property that every local minimum has good memorization and generalization performance. Joint work with Shiyu Liang and Ruoyu Sun.
Time: 11:00 AM – 12:00 PM Central (CDT; UTC−5)
Access: The seminar was delivered live via Zoom on Friday, October 16. A video recording of the talk is available on our YouTube channel here.