Quality-Energy Aware Synthesis of Approximate Hardware

Approximate computing is an aggressive design technique that achieves significant energy savings by trading off computational precision and accuracy in inherently error-tolerant applications. This introduces quality as a fundamental new design parameter. While ad-hoc solutions have been explored at various levels of abstraction, systematic design approaches are lacking.
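
To make the quality-energy tradeoff concrete, here is a minimal sketch (my own illustration, not the project's synthesis method): approximating integer addition by truncating low-order bits, with a made-up energy proxy proportional to the number of active adder bits.

```python
# A minimal sketch of a quality-energy tradeoff: approximate addition by
# ignoring low-order bits. The "energy" figure is a hypothetical proxy
# (active adder bits), purely for illustration.

def approx_add(a: int, b: int, dropped_bits: int) -> int:
    """Add two integers while ignoring the lowest `dropped_bits` bits."""
    mask = ~((1 << dropped_bits) - 1)
    return (a & mask) + (b & mask)

a, b = 12345, 6789
exact = a + b
for k in range(0, 8, 2):
    approx = approx_add(a, b, k)
    rel_error = abs(exact - approx) / exact   # quality loss
    energy = 16 - k                           # hypothetical active-bit count
    print(f"dropped_bits={k}: result={approx}, "
          f"rel_error={rel_error:.4f}, energy~{energy}")
```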

The Burden of Risk Aversion in Selfish Routing

Traffic congestion aggravates the daily lives of millions of people around the globe, and congestion games from game theory provide a suitable tool to understand its effects and offer insights on how to alleviate it. Classic congestion games assume deterministic edge delays, while in reality delays are uncertain, and risk-averse drivers might prefer longer but safer routes, further exacerbating the problem of increased travel times and emissions.
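
A toy illustration of why risk aversion changes route choice (the delay samples and the mean-plus-deviation objective below are my own assumptions, not the paper's model): a risk-neutral driver minimizes expected delay, while a risk-averse driver penalizes variability, which can flip the choice toward a longer but more predictable route.

```python
# Two routes with random delays: one fast on average but volatile,
# one slower but reliable. A risk-averse cost of mean + lam * std
# (lam is a hypothetical risk-aversion weight) flips the preference.

import statistics

routes = {
    "highway": [20, 20, 25, 60, 20, 20, 55, 20],    # fast on average, volatile
    "side_road": [32, 33, 34, 33, 32, 34, 33, 33],  # slower but reliable
}

lam = 1.0  # risk-aversion weight (hypothetical)

for name, delays in routes.items():
    mean = statistics.mean(delays)
    std = statistics.pstdev(delays)
    print(f"{name}: E[delay]={mean:.1f}, risk-averse cost={mean + lam * std:.1f}")
# The highway wins on expectation; the side road wins under risk aversion.
```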

Modeling and Algorithms for Aggregated Data

Databases in domains such as healthcare are routinely released to the public in aggregated form to preserve privacy. However, naively applying existing modeling techniques to aggregated data runs into the ecological fallacy, which can drastically reduce the accuracy of results and often leads to misleading inferences at the individual level. The project by Prof.
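
A hedged toy example of the ecological fallacy (the dataset and numbers below are fabricated for illustration, not from the project): within each group the relationship between x and y is negative, yet a regression fit only to the group-level aggregates suggests a positive one.

```python
# Simpson's-paradox-style illustration: within-group trend is negative,
# but a fit on group means (as in aggregated releases) is positive.

import numpy as np

rng = np.random.default_rng(0)
xs, ys = [], []
for group_mean in [1.0, 4.0, 7.0]:
    x = group_mean + rng.normal(0, 0.5, 100)
    # slope of -0.8 within each group, plus noise
    y = group_mean - 0.8 * (x - group_mean) + rng.normal(0, 0.2, 100)
    xs.append(x)
    ys.append(y)

# Individual-level trend (per-group ground truth is -0.8)
print("within-group slope:", np.polyfit(xs[0], ys[0], 1)[0])

# Aggregate-level trend using only group means, as in released data
gx = [x.mean() for x in xs]
gy = [y.mean() for y in ys]
print("aggregate slope:", np.polyfit(gx, gy, 1)[0])  # ~ +1.0, misleading sign
```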

Detecting Sponsored Recommendations

With a vast number of items, web pages, and news stories to choose from, both online services and their customers benefit tremendously from personalized recommender systems. Such systems, however, provide great opportunities for targeted advertisements, by displaying ads alongside genuine recommendations. We consider a biased recommendation system where such ads are displayed without any tags (disguised as genuine recommendations), rendering them indistinguishable to a single user. We ask whether it is possible for a small subset of collaborating users to detect such a bias.
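
A minimal sketch of the intuition under my own assumptions (not the paper's algorithm): collaborating users with deliberately different tastes compare their recommendation lists, and an item recommended to nearly everyone regardless of taste is a statistical red flag for an untagged sponsored placement.

```python
# Hypothetical top-5 lists returned to 6 colluding users with diverse
# profiles; "item_X" appears for all of them, which genuine personalization
# would make unlikely.

from collections import Counter

recs = [
    ["jazz_album", "blender", "item_X", "novel", "headphones"],
    ["ski_gear", "item_X", "cookbook", "tent", "camera"],
    ["math_text", "item_X", "keyboard", "jazz_album", "lamp"],
    ["toy_car", "stroller", "item_X", "crib", "bib"],
    ["dog_bed", "leash", "item_X", "chew_toy", "bowl"],
    ["guitar", "amp", "picks", "item_X", "strap"],
]

counts = Counter(item for lst in recs for item in lst)
n_users = len(recs)
for item, c in counts.items():
    if c / n_users > 0.8:  # shown to >80% of very dissimilar users
        print(f"suspicious item: {item} (shown to {c}/{n_users} users)")
```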

Scheduling for Stream Computing in the Cloud

Motivated by emerging big streaming data processing paradigms (e.g., Twitter Storm, Streaming MapReduce), we investigate the problem of scheduling graphs over a large cluster of servers. Each graph is a job, where nodes represent compute tasks and edges indicate data-flows between these compute tasks. Jobs (graphs) arrive randomly over time, and upon completion, leave the system. When a job arrives, the scheduler needs to partition the graph and distribute it over the servers to satisfy load balancing and cost considerations.
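A simplified illustration of the placement step (the job graph, CPU demands, traffic weights, and greedy rule below are my own sketch, not the paper's scheduler): place each task on the server that minimizes a combination of the resulting load and the traffic on data-flow edges cut across servers.

```python
# Greedy placement of one job graph over servers, balancing load against
# cut data-flow traffic. All numbers are hypothetical.

tasks = {"src": 2, "parse": 3, "count": 4, "sink": 1}      # CPU demands
edges = [("src", "parse", 10), ("parse", "count", 8), ("count", "sink", 5)]

n_servers = 2
load = [0.0] * n_servers
placement = {}

for task, demand in tasks.items():
    best, best_cost = None, float("inf")
    for s in range(n_servers):
        # traffic of edges this placement would cut (unplaced neighbors
        # default to server s, i.e., count as not cut)
        cut = sum(w for u, v, w in edges
                  if task in (u, v)
                  and placement.get(u if v == task else v, s) != s)
        cost = load[s] + demand + 0.5 * cut
        if cost < best_cost:
            best, best_cost = s, cost
    placement[task] = best
    load[best] += demand

print("placement:", placement, "loads:", load)
```
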

Generalization of Standard Matrix Completion

Joydeep Ghosh and student Suriya Gunasekar work on generalizing standard matrix completion along several directions. In previous work, we proposed tractable estimators for matrix completion with observations arising from heterogeneous datatypes and heterogeneous noise models. In more recent work, we focus on consistency results for the collective matrix completion problem of jointly recovering a collection of matrices with shared structure.
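
For context, here is a sketch of the standard setting the project generalizes (dimensions, rank, and regularization are my own assumptions): recovering a low-rank matrix from a subset of its entries via alternating least squares.

```python
# Standard matrix completion via alternating least squares (ALS):
# alternately solve ridge regressions for the row and column factors
# using only the observed entries.

import numpy as np

rng = np.random.default_rng(1)
m, n, r = 30, 20, 3
M = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))  # ground-truth low-rank
mask = rng.random((m, n)) < 0.4                        # 40% entries observed

U, V = rng.normal(size=(m, r)), rng.normal(size=(n, r))
lam = 0.1
for _ in range(30):
    for i in range(m):                                 # update row factors
        idx = mask[i]
        A = V[idx].T @ V[idx] + lam * np.eye(r)
        U[i] = np.linalg.solve(A, V[idx].T @ M[i, idx])
    for j in range(n):                                 # update column factors
        idx = mask[:, j]
        A = U[idx].T @ U[idx] + lam * np.eye(r)
        V[j] = np.linalg.solve(A, U[idx].T @ M[idx, j])

err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative recovery error: {err:.3f}")
```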

Bayesian Sparse Principal Component Analysis

Many real-life high-dimensional datasets can be reasonably represented as linear combinations of a few sparse vectors, so a succinct representation of such data with a few selected variables is highly desirable. A Bayesian setup is useful because the limitation of observing only a small number of high-dimensional data points can be alleviated by well-designed, domain-specific priors.
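
A small sketch of the data model only (dimensions and sparsity levels are my assumptions): samples generated as linear combinations of a few sparse vectors. A full Bayesian treatment with domain-specific priors is beyond a few lines, so the components here are recovered with scikit-learn's (non-Bayesian) SparsePCA as a stand-in.

```python
# Generate data as linear combinations of a few sparse vectors, then
# recover sparse components with (non-Bayesian) SparsePCA.

import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
p, k, n = 100, 3, 50                  # dimension, #components, #samples
components = np.zeros((k, p))
for i in range(k):
    support = rng.choice(p, size=5, replace=False)  # 5 nonzeros each
    components[i, support] = rng.normal(size=5)

X = rng.normal(size=(n, k)) @ components + 0.01 * rng.normal(size=(n, p))

spca = SparsePCA(n_components=k, alpha=1.0, random_state=0)
spca.fit(X)
print("nonzeros per learned component:",
      (np.abs(spca.components_) > 1e-8).sum(axis=1))
```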

What multislope path loss models tell us about the fundamental limits of wireless network densification in 5G and beyond

The vast majority of the increase in mobile data throughput has been enabled by ever-increasing densification, i.e., adding more base stations and access points that have a wired backhaul connection. This trend is set to continue for at least the next decade, primarily through the provisioning of small cells such as picocells and femtocells. What if we reached a point where adding more infrastructure no longer increased wireless network throughput? This would be comparable to the impending end of "Moore's Law": a cataclysmic event with far-reaching consequences.
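
A hedged sketch of a dual-slope path loss model, the simplest multislope case (the exponents and critical distance below are illustrative, not the paper's parameters): path loss grows with exponent alpha0 up to a critical distance and with a steeper alpha1 beyond it.

```python
# Continuous dual-slope path loss: exponent alpha0 up to distance rc,
# alpha1 beyond it. All parameter values are hypothetical.

import math

def dual_slope_path_loss_db(d: float, alpha0=2.0, alpha1=4.0,
                            rc=50.0) -> float:
    """Path loss in dB at distance d (meters), continuous at d = rc."""
    if d <= rc:
        return 10 * alpha0 * math.log10(d)
    return 10 * alpha0 * math.log10(rc) + 10 * alpha1 * math.log10(d / rc)

for d in [10, 50, 100, 500]:
    print(f"d={d:>4} m: PL = {dual_slope_path_loss_db(d):.1f} dB")
```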

Networks-of-Systems Simulation

In future computing systems, such as the Internet of Things (IoT), functionality is increasingly defined by the networked connectivity of spatially distributed devices. This, however, poses fundamentally new design challenges and tradeoffs. Computation and communication need to be tightly coupled and jointly explored, e.g., to determine whether a functionality should be performed locally or remotely over the network to achieve the best performance and energy consumption.
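
A toy version of that local-versus-remote decision (the cost model and every parameter below are my own assumptions, not the project's simulator): compare the latency and energy of executing a task locally against transmitting its input over the network and running it remotely.

```python
# Compare local execution vs. offloading for one task. All parameter
# values (CPU speed, energy per cycle, bandwidth, radio power, remote
# latency) are hypothetical.

def local_cost(cycles, cpu_hz=1e9, joules_per_cycle=1e-9):
    latency = cycles / cpu_hz
    energy = cycles * joules_per_cycle
    return latency, energy

def remote_cost(input_bits, bandwidth_bps=1e6, tx_watts=0.5,
                remote_latency=0.01):
    tx_time = input_bits / bandwidth_bps
    return tx_time + remote_latency, tx_time * tx_watts

cycles, input_bits = 5e8, 2e5
for name, (lat, en) in [("local", local_cost(cycles)),
                        ("remote", remote_cost(input_bits))]:
    print(f"{name}: latency={lat:.3f} s, energy={en:.3f} J")
# The better option flips as bandwidth or CPU speed changes, which is why
# computation and communication must be explored jointly.
```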
