WNCG Seminar Series

Seminar
Friday, November 13, 2015
UTA 7.532

Abstract: Submodular functions capture a wide spectrum of discrete problems in machine learning, signal processing and computer vision. They are characterized by intuitive notions of diminishing returns and economies of scale, and often lead to practical algorithms with theoretical guarantees.
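For context (this standard definition is added here and was not part of the original abstract), diminishing returns can be stated formally: a set function $f : 2^V \to \mathbb{R}$ on a ground set $V$ is submodular if

$$f(A \cup \{e\}) - f(A) \;\ge\; f(B \cup \{e\}) - f(B) \quad \text{for all } A \subseteq B \subseteq V,\ e \in V \setminus B.$$

In words, the marginal gain from adding an element shrinks as the base set grows, as with the extra coverage contributed by one more sensor once many sensors are already placed.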

In the first part of this talk, I will give a general introduction to the concept of submodular functions, their optimization and example applications in machine learning.

In the second part, I will demonstrate how the close connection of submodularity to convexity leads to fast algorithms for minimizing a subclass of submodular functions, namely those that decompose as a sum of simpler submodular functions. Using a specific relaxation, the algorithms solve the discrete submodular optimization problem as a "best approximation" problem. They are easy to use and parallelize, and solve both the convex relaxation and the original discrete problem. Their convergence analysis combines elements of geometry and spectral graph theory.
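The bridge between the discrete problem and its convex relaxation is the Lovász extension, which can be evaluated by a simple greedy algorithm. The following minimal Python sketch (added for illustration; it is not code from the talk, and the function names are my own) shows this evaluation for any set function given as a callable on frozensets:

import numpy as np

def lovasz_extension(f, x):
    """Evaluate the Lovász extension of f at x via the greedy algorithm.

    f: set function, f(frozenset) -> float, with f(empty set) = 0.
    x: 1-D numpy array indexed by the elements of the ground set.
    """
    order = np.argsort(-x)          # visit coordinates in decreasing order
    value, prev_set = 0.0, frozenset()
    for i in order:
        cur_set = prev_set | {i}
        # weight the marginal gain of element i by its coordinate value
        value += x[i] * (f(cur_set) - f(prev_set))
        prev_set = cur_set
    return value

# Example: the cut function of a 2-node graph with one edge, a classic
# submodular function; its Lovász extension is the total variation.
cut = lambda S: 1.0 if len(S) == 1 else 0.0
print(lovasz_extension(cut, np.array([0.7, 0.2])))  # ≈ 0.5 = |0.7 - 0.2|

Because the Lovász extension is convex exactly when f is submodular, minimizing it (and rounding) recovers a minimizer of the discrete problem, which is what makes relaxation-based algorithms of the kind described above possible.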

This is joint work with Robert Nishihara, Francis Bach, Suvrit Sra and Michael I. Jordan.

Watch the full talk on the WNCG YouTube Channel:

Watch Part 1 Here

Watch Part 2 Here

Speaker

Stefanie Jegelka
Associate Professor
Massachusetts Institute of Technology

Stefanie Jegelka is an X-Window Consortium Career Development Associate Professor in the Department of EECS at MIT and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). Before joining MIT, she was a postdoctoral researcher at UC Berkeley; she obtained her PhD from ETH Zurich and the Max Planck Institute for Intelligent Systems. Stefanie has received a Sloan Research Fellowship, an NSF CAREER Award, a DARPA Young Faculty Award, the German Pattern Recognition Award, and a Best Paper Award at ICML. Her research interests span the theory and practice of algorithmic machine learning, including discrete and continuous optimization, discrete probability, and learning with structured data.