Several optimization problems in machine learning, data mining, and graph theory can be expressed as quadratic maximization problems subject to integrality, positivity, or sparsity constraints. These include sparse PCA, densest subgraph, nonnegative matrix factorization, MaxCut, maximum clique, and many others. These problems are known to be computationally intractable and, in many cases, hard to approximate. WNCG Profs.
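As a concrete illustration of the quadratic-maximization viewpoint, consider MaxCut: for a graph with adjacency matrix W and a partition encoded by x in {-1, +1}^n, the cut value equals (1/4)(sum of W - x^T W x), so maximizing the cut is a quadratic maximization over an integrality constraint. The sketch below (the small example graph is an assumption chosen for illustration) verifies this by brute force on a 4-cycle:

```python
import itertools
import numpy as np

# Adjacency matrix of a small example graph (a 4-cycle); chosen only for illustration.
W = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])

def cut_value(x, W):
    # cut(x) = (1/4) * sum_ij W_ij * (1 - x_i x_j) = (W.sum() - x^T W x) / 4,
    # so maximizing the cut is equivalent to maximizing -x^T W x over x in {-1, +1}^n.
    return (W.sum() - x @ W @ x) / 4

# Brute-force search over all 2^n sign vectors (feasible only for tiny n).
best = max((np.array(x) for x in itertools.product([-1, 1], repeat=len(W))),
           key=lambda x: cut_value(x, W))
print(best, cut_value(best, W))  # the 4-cycle is bipartite, so the max cut is 4
```

The exponential enumeration is exactly what makes these problems intractable at scale; the quadratic form is what relaxation-based approximation algorithms exploit.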
Prof. Joydeep Ghosh of UT ECE was the keynote speaker at the inaugural Workshop on Divergences and Divergence Learning (WDDL), held in Atlanta in June 2013. In his talk, "Learning Bregman Divergences for Prediction with Generalized Linear Models," which reflects joint work with ECE and WNCG student Sreangsu Acharrya, he introduced an efficient approach to learning a broad class of predictive models. Most remarkably, the approach can estimate model parameters even when the loss function is unknown.
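For readers unfamiliar with the central object of the talk: a Bregman divergence is generated by any strictly convex function phi via D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>, and different choices of phi recover familiar losses. The sketch below is a minimal illustration of this standard definition (not the talk's learning method); the function names and test vectors are assumptions for the example:

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    return phi(x) - phi(y) - grad_phi(y) @ (x - y)

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.4, 0.4, 0.2])

# phi(u) = ||u||^2 generates the squared Euclidean distance.
sq = bregman(lambda u: u @ u, lambda u: 2 * u, x, y)

# phi(u) = sum u log u (negative entropy) generates the KL divergence
# between probability vectors.
kl = bregman(lambda u: (u * np.log(u)).sum(), lambda u: np.log(u) + 1, x, y)
```

Because each loss corresponds to a generator phi, learning the divergence amounts to learning the convex generator, which is the sense in which a predictive model can be fit without fixing the loss function in advance.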