Events

Upcoming Events

Fri
Oct 30
9:00 AM
Online

Seminar will be delivered live via Zoom on Friday, October 30, 2020, from 9:00 AM to 10:00 AM U.S. Central Time (CDT / UTC -5).

The Zoom conferencing system is accessible to UT faculty, staff, and students with support from ITS. Otherwise, you can sign up for a free account on the Zoom website.

Fri
Oct 30
11:00 AM
Online - Live

Machine learning as a service (MLaaS) has emerged as a paradigm that allows clients to outsource machine learning computations to the cloud. However, MLaaS raises immediate security concerns, specifically the integrity (or correctness) of computations performed by an untrusted cloud and the privacy of the client’s data. In this talk, I discuss frameworks we have built on cryptographic tools for secure deep-learning-based inference on an untrusted cloud: CryptoNAS (building models for private inference) and SafetyNets (addressing correctness).

Thu
Nov 05
11:00 AM
Online

Seminar will be delivered live via Zoom on Thursday, November 5, 2020, from 11:00 AM to 12:00 PM U.S. Central Time (CST / UTC -6).


Fri
Nov 13
11:00 AM
Online

Maryam Fazel (University of Washington)

Date: Friday, November 13, 2020
Time: 11:00 AM – 12:00 PM (CST; UTC -6)
Location: Online (Zoom link will be provided)

Title: TBD

Abstract: TBD

Fri
Nov 20
11:00 AM
Online

Graph Neural Networks (GNNs) have become a popular tool for learning representations of graph-structured inputs, with applications in computational chemistry, recommendation, pharmacy, reasoning, and many other areas.

Recent Events

23 Oct 2020

The focus of our work is to obtain finite-sample and/or finite-time convergence bounds for various model-free Reinforcement Learning (RL) algorithms. Many RL algorithms are special cases of Stochastic Approximation (SA), a popular approach for solving fixed-point equations when the information is corrupted by noise. We first obtain finite-sample bounds for general SA using a generalized Moreau envelope as a smooth potential/Lyapunov function.
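The abstract above does not give details of the algorithm, but the core SA idea it refers to can be sketched as a Robbins–Monro iteration: to solve the fixed-point equation x = F(x) from noisy evaluations of F, repeatedly move x toward the noisy estimate with a diminishing step size. The operator and constants below are hypothetical, chosen only for illustration.

```python
import random

def stochastic_approximation(F, x0, steps=20000, noise_std=0.1, seed=0):
    """Robbins-Monro iteration for the fixed-point equation x = F(x),
    given only noise-corrupted evaluations of F."""
    rng = random.Random(seed)
    x = x0
    for k in range(1, steps + 1):
        alpha = 1.0 / k                      # diminishing step size
        noisy_F = F(x) + rng.gauss(0.0, noise_std)
        x = x + alpha * (noisy_F - x)        # move toward the noisy estimate
    return x

# Illustrative contraction F(x) = 0.5*x + 1 with unique fixed point x* = 2;
# the iterates converge to x* despite the additive noise.
x_star = stochastic_approximation(lambda x: 0.5 * x + 1.0, x0=0.0)
```

Finite-sample bounds of the kind mentioned in the abstract quantify how fast such iterates approach the fixed point as a function of the number of samples.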

16 Oct 2020

Overparameterized neural networks have proved to be remarkably successful in many complex tasks such as image classification and deep reinforcement learning. In this talk, we will consider the role of explicit regularization in training overparameterized neural networks. Specifically, we consider ReLU networks and show that the landscape of commonly used regularized loss functions have the property that every local minimum has good memorization and regularization performance. Joint work with Shiyu Liang and Ruoyu Sun.

13 Oct 2020

Abstract TBA

Event time is 11:00 AM – 12:00 PM Central (CDT; UTC -5)

Access: Seminar will be delivered live via Zoom on the date and at the time shown above. Access link TBA.


02 Oct 2020

Federated Learning has emerged as an important paradigm in modern large-scale machine learning, where the training data remains distributed over a large number of clients, which may be phones, network sensors, hospitals, etc. A major challenge in the design of optimization methods for Federated Learning is the heterogeneity (i.e., non-i.i.d. nature) of client data. This problem affects the currently dominant algorithm deployed in practice, known as Federated Averaging (FedAvg): we provide results for FedAvg quantifying the degree to which this problem causes unstable or slow convergence.
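The FedAvg scheme the abstract refers to can be sketched in one dimension: each round, every client runs a few local gradient-descent steps from the current global model, and the server averages the resulting local models. The client objectives below are hypothetical toy quadratics; with heterogeneous data, multiple local steps pull each client toward its own optimum within a round, which is the source of the drift the talk analyzes.

```python
def fedavg(client_grads, x0, rounds=50, local_steps=5, lr=0.1):
    """One-parameter FedAvg sketch: per round, each client takes
    `local_steps` gradient-descent steps from the global model x,
    and the server averages the local results."""
    x = x0
    for _ in range(rounds):
        local_models = []
        for grad in client_grads:
            xi = x
            for _ in range(local_steps):
                xi = xi - lr * grad(xi)      # local update on this client
            local_models.append(xi)
        x = sum(local_models) / len(local_models)  # server averaging
    return x

# Two heterogeneous clients: f1(x) = (x - 1)^2 and f2(x) = (x + 1)^2,
# whose local optima (+1 and -1) differ; the global optimum of f1 + f2
# is x* = 0. During each round the clients drift toward their own
# optima before the server averages them back.
x = fedavg([lambda x: 2 * (x - 1), lambda x: 2 * (x + 1)], x0=5.0)
```

In this symmetric toy case the drifts cancel and FedAvg still reaches the global optimum; the talk's results quantify when heterogeneity instead causes slow or unstable convergence.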