WNCG - Wireless Networking and Communications Group - Data Mining
https://wncg.org/tags/data-mining
Quadratic Maximization Problems
https://wncg.org/research/briefs/quadratic-maximization-problems
<div class="field field-name-field-publish-date field-type-datetime field-label-hidden"><div class="field-items"><div class="field-item even"><span class="date-display-single">Tuesday, July 29, 2014</span></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"> <p>Several optimization problems in machine learning, data mining and graph theory can be expressed as quadratic maximization problems subject to integrality, positivity, or sparsity constraints. These include Sparse PCA, Densest Subgraph, nonnegative matrix factorization, MaxCut, Maximum Clique and many others. These problems are known to be computationally intractable and, in many cases, hard to approximate. WNCG Profs. Alex Dimakis and Constantine Caramanis, with students Dimitris Papailiopoulos, Ioannis Mitliagkas and Megasthenis Asteris, are developing a novel technique that solves these problems exactly, even under combinatorial constraints, when the quadratic form matrix is positive semidefinite and low rank. The method, called the spannogram, transforms the low-rank space using hyperspherical coordinates, which allows constraints such as integrality or sparsity to be handled directly.</p>
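As a concrete illustration of why low rank helps, consider the simplest case: when the matrix has rank one, sparse PCA can be solved exactly by keeping the largest-magnitude entries of the leading eigenvector. The NumPy sketch below shows this rank-1 special case only; the function name and example data are ours for illustration, not the full spannogram algorithm.

```python
import numpy as np

def sparse_pca_rank1(A, k):
    """Exact k-sparse principal component when A is rank-1 PSD.

    If A = lam * v v^T with lam > 0, then x^T A x = lam * (v^T x)^2,
    so the optimal k-sparse unit vector keeps the k largest-magnitude
    entries of v and normalizes.
    """
    eigvals, eigvecs = np.linalg.eigh(A)
    v = eigvecs[:, -1]                       # leading eigenvector
    support = np.argsort(np.abs(v))[-k:]     # k largest |v_i|
    x = np.zeros_like(v)
    x[support] = v[support]
    return x / np.linalg.norm(x)

# Toy rank-1 matrix built from a known vector (illustrative data)
v = np.array([0.1, -0.7, 0.2, 0.65, 0.05])
A = 3.0 * np.outer(v, v)
x = sparse_pca_rank1(A, k=2)
print(x @ A @ x)   # objective value of the exact 2-sparse optimum
```

With combinatorial constraints, the difficulty in general is choosing the support; rank one collapses that choice to a simple sort.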
<p>Clearly, real-world data sets rarely produce exactly low-rank matrices. For that reason, WNCG researchers obtain low-rank approximations and performance bounds that depend on the spectral decay of the data matrix's eigenvalues. The WNCG team is developing a general framework that combines low-rank approximations with low-rank quadratic optimization. For some problems, the researchers obtain strong data-dependent bounds and algorithms that outperform the previous state of the art. These papers have been accepted to the highly selective International Conference on Machine Learning (ICML). </p>
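The truncation step behind such approximations is standard: keep the d leading eigenpairs, and the spectral-norm error of the approximation is exactly the (d+1)-th largest eigenvalue, so fast spectral decay gives a tight fit. A generic NumPy sketch of this step (not the papers' specific data-dependent bounds):

```python
import numpy as np

def rank_d_approximation(A, d):
    """Best rank-d approximation of a symmetric PSD matrix A.

    Keeps the d leading eigenpairs. The spectral-norm error equals the
    (d+1)-th largest eigenvalue, so fast eigenvalue decay gives a tight fit.
    """
    eigvals, eigvecs = np.linalg.eigh(A)            # ascending order
    V = eigvecs[:, -d:] * np.sqrt(eigvals[-d:])     # scaled leading eigenvectors
    err = eigvals[-(d + 1)]                         # first discarded eigenvalue
    return V @ V.T, err

rng = np.random.default_rng(1)
M = rng.normal(size=(8, 8))
A = M @ M.T                                         # random PSD test matrix
A3, err = rank_d_approximation(A, d=3)
print(abs(np.linalg.norm(A - A3, 2) - err) < 1e-8)  # the error bound is tight
```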
<p><strong>Sparse PCA</strong></p>
<p>Principal Component Analysis (PCA) reduces data dimensionality by projecting it onto principal subspaces spanned by the leading eigenvectors of the sample covariance matrix. PCA is arguably the workhorse of high-dimensional analysis and one of the most widely used algorithms, with applications ranging from computer vision and document clustering to network anomaly detection. Sparse PCA is a useful variant that offers higher data interpretability. In recent work, the WNCG team developed their framework and used it to design a novel algorithm for Sparse PCA. For several datasets, the researchers obtained excellent empirical performance and provable upper bounds that guarantee their objective is close to the unknown optimum. </p>
<p>Paper: <a href="http://users.ece.utexas.edu/~dimakis/NNSPCA_ICML.pdf">Non-negative Sparse PCA with Provable Guarantees</a></p>
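For reference, the plain PCA projection described above, centering the data and projecting onto the leading eigenvectors of the sample covariance, takes only a few lines of NumPy (a generic textbook illustration, not code from the paper):

```python
import numpy as np

def pca_project(X, d):
    """Project the rows of X onto the d leading principal components."""
    Xc = X - X.mean(axis=0)                  # center the data
    cov = Xc.T @ Xc / (len(X) - 1)           # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    W = eigvecs[:, -d:]                      # d leading eigenvectors
    return Xc @ W                            # reduced representation

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))               # 100 points in 10 dimensions
Z = pca_project(X, d=2)
print(Z.shape)                               # (100, 2)
```

Sparse PCA replaces the dense eigenvectors W with sparse loading vectors, which is what makes the problem combinatorial.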
<p><strong>Densest-k-Subgraph</strong> </p>
<p>Given a large graph and a parameter k, WNCG researchers are interested in detecting a small dense subgraph of size k embedded in an unweighted, undirected background graph. The Densest-k-Subgraph (DkS) problem is fundamental to many applications, including graph and cluster analysis, cyber-community detection and computer security. Using this framework, the team developed a novel algorithm with provable approximation guarantees for DkS. The researchers also implemented a distributed version of their algorithm on the MapReduce framework, scaling up to 800 cores on Amazon EC2, which allowed them to find dense clusters in massive graphs with billions of edges.</p>
<p>Paper: <a href="http://users.ece.utexas.edu/~dimakis/DKS_ICML.pdf">Finding Dense Subgraphs via Low-Rank Bilinear Optimization</a></p>
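A one-eigenvector simplification of the low-rank bilinear idea can be sketched as follows: take the k vertices with the largest leading-eigenvector entries of the adjacency matrix and report the induced edge count. This toy heuristic is ours for illustration (the paper's algorithm searches the full low-rank space); it recovers a planted clique in a small example:

```python
import numpy as np

def dks_rank1(adj, k):
    """Rank-1 heuristic for Densest-k-Subgraph: keep the k vertices with
    the largest leading-eigenvector entries and count induced edges."""
    eigvals, eigvecs = np.linalg.eigh(adj)
    v = np.abs(eigvecs[:, -1])               # Perron vector, made nonnegative
    S = np.sort(np.argsort(v)[-k:])          # top-k vertices
    edges = adj[np.ix_(S, S)].sum() / 2      # each undirected edge counted twice
    return S.tolist(), edges

# Toy graph: a 4-clique on vertices 0..3 plus a pendant path 3-4-5
A = np.zeros((6, 6))
for i in range(4):
    for j in range(4):
        if i != j:
            A[i, j] = 1.0
A[3, 4] = A[4, 3] = 1.0
A[4, 5] = A[5, 4] = 1.0

S, m = dks_rank1(A, k=4)
print(S, m)   # the planted clique with its 6 internal edges
```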
<p><strong>Nonnegative Sparse PCA</strong></p>
<p>For a given matrix A, nonnegative sparse PCA asks for a sparse vector x with nonnegative entries that maximizes the quadratic form x<sup>T</sup>Ax. This is a computationally intractable problem that relates to obtaining an approximate nonnegative matrix factorization (NMF). Such nonnegative factorizations are useful in numerous applications, including air emission control, graph clustering, text mining and hyperspectral data analysis for remote sensing. In recent work, the WNCG team extended their spannogram methodology to handle nonnegativity constraints, yielding a novel algorithm for NMF and new approximation guarantees under spectral assumptions. </p>
<p>Paper: <a href="http://jmlr.org/proceedings/papers/v28/papailiopoulos13.pdf">Sparse PCA through Low-Rank Approximations</a></p>
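In the rank-1 special case, the nonnegativity constraint is easy to see through: since x<sup>T</sup>Ax = lam(v<sup>T</sup>x)<sup>2</sup>, an optimal nonnegative k-sparse x keeps the k largest entries of the positive part of v (or of -v, since the eigenvector sign is arbitrary). A NumPy sketch of this special case only (illustrative; not the team's general algorithm):

```python
import numpy as np

def nn_sparse_pca_rank1(A, k):
    """Exact nonnegative k-sparse maximizer of x^T A x for rank-1 PSD A.

    With A = lam * v v^T, the objective is lam * (v^T x)^2, so for x >= 0
    it suffices to keep the k largest entries of the positive part of v
    (or of -v, since the eigenvector sign is arbitrary) and normalize.
    """
    eigvals, eigvecs = np.linalg.eigh(A)
    v = eigvecs[:, -1]
    best, best_val = None, -np.inf
    for u in (np.maximum(v, 0.0), np.maximum(-v, 0.0)):
        idx = np.argsort(u)[-k:]             # k largest nonnegative entries
        x = np.zeros_like(u)
        x[idx] = u[idx]
        norm = np.linalg.norm(x)
        if norm > 0:
            x = x / norm
            val = x @ A @ x
            if val > best_val:
                best, best_val = x, val
    return best

# Toy rank-1 matrix (illustrative data)
v = np.array([0.6, -0.3, 0.5, 0.55, -0.1])
A = 2.0 * np.outer(v, v)
x = nn_sparse_pca_rank1(A, k=2)
print(np.nonzero(x)[0], x @ A @ x)   # support and objective of the optimum
```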
</div></div></div><div class="field field-name-field-related-faculty field-type-node-reference field-label-inline clearfix"><div class="field-label">Related Faculty: </div><div class="field-items"><div class="field-item even"><a href="/people/faculty/alex-dimakis">Alex Dimakis</a></div><div class="field-item odd"><a href="/people/faculty/constantine-caramanis">Constantine Caramanis</a></div></div></div><div class="field field-name-field-related-students field-type-node-reference field-label-inline clearfix"><div class="field-label">Related Researchers: </div><div class="field-items"><div class="field-item even"><a href="/people/students/dimitris-papailiopoulos">Dimitris Papailiopoulos</a></div><div class="field-item odd"><a href="/people/students/megasthenis-asteris">Megasthenis Asteris</a></div><div class="field-item even"><a href="/people/students/ioannis-mitliagkas">Ioannis Mitliagkas</a></div></div></div><div class="field field-name-field-tags field-type-taxonomy-term-reference field-label-inline clearfix"><div class="field-label">Keywords: </div><div class="field-items"><div class="field-item even"><a href="/tags/quadratic-maximization-problems">quadratic maximization problems</a>, <a href="/tags/machine-learning">Machine Learning</a>, <a href="/tags/data-mining">Data Mining</a>, <a href="/tags/graph-theory">graph theory</a></div></div></div>
Prof. Joydeep Ghosh Gives Keynotes at WDDL2013 and DMH 2013
https://wncg.org/news/prof-joydeep-ghosh-gives-keynotes-wddl2013-and-dmh-2013
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"> <p>Prof. Joydeep Ghosh of UT ECE was the keynote speaker at the inaugural Workshop on Divergences and Divergence Learning (WDDL), held in Atlanta in June 2013. His talk, entitled "Learning Bregman Divergences for Prediction with Generalized Linear Models," reflects joint work with ECE and WNCG student Sreangsu Acharyya and introduced an efficient approach to learning a broad class of predictive models. What is most remarkable about this approach is that model parameters can be estimated even when the loss function is unknown. This breakthrough makes it possible to construct predictive models, in both online and batch settings, for certain complex problems where standard costs such as squared loss are inappropriate.</p>
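For context, a Bregman divergence is generated by a strictly convex function phi via D_phi(x, y) = phi(x) - phi(y) - &lt;grad phi(y), x - y&gt;; squared Euclidean distance and generalized KL divergence are the two classic instances. A small NumPy sketch of this textbook definition (not code from the talk):

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad_phi(y), x - y>."""
    return phi(x) - phi(y) - grad_phi(y) @ (x - y)

x = np.array([1.0, 2.0])
y = np.array([0.5, 1.0])

# phi(z) = ||z||^2 recovers the squared Euclidean distance
d = bregman(lambda z: z @ z, lambda z: 2 * z, x, y)
print(d)   # equals ||x - y||^2 = 1.25

# phi(z) = sum z_i log z_i recovers the generalized KL divergence
kl = bregman(lambda z: np.sum(z * np.log(z)),
             lambda z: np.log(z) + 1.0, x, y)
print(kl)
```

Learning the divergence, as in the talk, amounts to fitting the generating function phi from data rather than fixing it in advance.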
<p>Prof. Ghosh will also present the keynote address at the International Workshop on Data Mining for Healthcare (DMH), Philadelphia, Sept 11, 2013. The talk, entitled "Predictive Modeling of Large Healthcare Data under Privacy Constraints," will address the fundamental tension between the need to extract value from large quantities of health-related data and the desire to maintain the privacy of patients and caregivers. He will discuss two approaches that provide privacy-aware predictive modeling with little degradation in model quality despite restrictions on what can be shared or analyzed. The first approach focuses on extracting predictive value from data that has been aggregated at various levels due to privacy concerns, while the second introduces a novel, non-parametric Gibbs sampler that can generate "realistic but not real" data given a dataset that cannot be shared as is. This is joint work with ECE and WNCG student Yubin Park.</p>
</div></div></div><div class="field field-name-field-tags field-type-taxonomy-term-reference field-label-above"><div class="field-label">Keywords: </div><div class="field-items"><div class="field-item even"><a href="/tags/keynote">Keynote</a></div><div class="field-item odd"><a href="/tags/data-mining">Data Mining</a></div><div class="field-item even"><a href="/tags/predictive-modeling">Predictive Modeling</a></div></div></div><div class="field field-name-field-publish-date field-type-datetime field-label-above"><div class="field-label">Publish Date: </div><div class="field-items"><div class="field-item even"><span class="date-display-single">Tuesday, September 3, 2013</span></div></div></div><div class="field field-name-field-image field-type-image field-label-above"><div class="field-label">Key Image: </div><div class="field-items"><div class="field-item even"><img src="https://wncg.org/sites/wncg.org/files/Ghosh-News-Graphic_1.jpeg" width="350" height="250" /></div></div></div><div class="field field-name-field-related-faculty field-type-node-reference field-label-above"><div class="field-label">Related Faculty: </div><div class="field-items"><div class="field-item even"><a href="/people/faculty/joydeep-ghosh">Joydeep Ghosh</a></div></div></div><div class="field field-name-field-related-students field-type-node-reference field-label-above"><div class="field-label">Related Researchers: </div><div class="field-items"><div class="field-item even"><a href="/people/students/yubin-park">Yubin Park</a></div></div></div><div class="field field-name-field-feature field-type-list-boolean field-label-above"><div class="field-label">Feature: </div><div class="field-items"><div class="field-item even">No</div></div></div>