We present two examples of using machine learning to improve end-user quality of experience (QoE) in cellular networks operating today. In particular, we demonstrate how to automate both the clearing of operational faults in outdoor networks and the compensation of signal impairments in indoor networks for voice-over-LTE (VoLTE) applications. Our proposed methods are compatible with 3GPP LTE Release 8 and higher.
WNCG Profs. François Baccelli and Gustavo de Veciana and alumnus Pranav Madadi proposed a stochastic geometry framework to study temporal performance variations experienced by a mobile user in a cellular network. The focus is on the variations of the signal-to-noise ratio (SNR) and the downlink Shannon rate experienced as the user moves across a Poisson cellular network on the Euclidean plane.
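The setting above can be illustrated with a small simulation: base stations are dropped as a Poisson point process on a square, a user walks a straight line through the middle, associates with the nearest base station, and records the downlink Shannon rate at each step. This is a minimal sketch, not the paper's model; the intensity, path loss exponent, noise power, and trajectory are all illustrative assumptions.

```python
import math
import random

def poisson_sample(rng, mean):
    """Draw from Poisson(mean) via Knuth's method (fine for moderate means)."""
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_rate_along_path(intensity=0.1, side=20.0, alpha=4.0,
                             noise_power=1e-6, n_steps=40, seed=1):
    """Sketch: Shannon-rate variation for a user crossing a Poisson network.

    All parameter values are illustrative assumptions, not from the paper.
    """
    rng = random.Random(seed)
    # Homogeneous Poisson point process of base stations on a side x side square.
    n_bs = max(poisson_sample(rng, intensity * side * side), 1)
    bs = [(rng.uniform(0, side), rng.uniform(0, side)) for _ in range(n_bs)]
    rates = []
    for i in range(n_steps):
        # Straight-line trajectory across the middle of the square.
        x, y = side * i / (n_steps - 1), side / 2
        # Associate with the nearest base station; received power ~ d^-alpha.
        d = min(math.hypot(x - bx, y - by) for bx, by in bs)
        snr = max(d, 0.1) ** (-alpha) / noise_power
        rates.append(math.log2(1.0 + snr))  # downlink Shannon rate, bit/s/Hz
    return rates
```

Running the sketch yields a rate process that fluctuates along the path as the distance to the serving base station (and hence the SNR) changes, which is exactly the temporal variation the framework studies.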
Existing cellular network analyses, and even simulations, typically use the standard path loss model where received power decays as 1/d^x over a distance d, with a path loss exponent x. This model leads to tractable analysis of downlink cellular network performance with base stations distributed by a Poisson point process. However, it is widely known that this standard path loss model is quite idealized, and that in most scenarios the path loss exponent x is itself a function of d.
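One common way to capture a distance-dependent exponent is a dual-slope model: a near-field exponent up to a breakpoint distance and a larger far-field exponent beyond it, joined continuously. The sketch below contrasts this with the single-exponent model; the breakpoint and exponent values are illustrative assumptions, not from any specific measurement campaign.

```python
import math

def single_slope_db(d, alpha=4.0):
    """Standard model: path loss in dB for power decaying as 1/d^alpha."""
    return 10 * alpha * math.log10(max(d, 1.0))

def dual_slope_db(d, d_break=100.0, alpha_near=2.0, alpha_far=4.0):
    """Dual-slope model: the effective exponent depends on distance d.

    Exponent alpha_near within d_break metres, alpha_far beyond it,
    with the two segments matched at the breakpoint so the loss curve
    is continuous. All numeric values here are illustrative.
    """
    d = max(d, 1.0)  # clamp to avoid the singularity at the transmitter
    if d <= d_break:
        return 10 * alpha_near * math.log10(d)
    return (10 * alpha_near * math.log10(d_break)
            + 10 * alpha_far * math.log10(d / d_break))
```

For example, at d = 10 m the dual-slope loss is 20 dB (free-space-like), while the single-slope model with exponent 4 already predicts 40 dB; the two models diverge substantially exactly where the fixed-exponent idealization is weakest.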
In existing 3G and 4G cellular networks, mobile devices associate with the same base station (BS) in the downlink (DL) and uplink (UL) directions. A key reason for this is that the overhead and control channels in each direction help inform communication in the other direction. For example, in LTE, the resource block assignments for both the DL and UL are given in DL control messages.
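The coupled association rule described above can be sketched as follows: the device picks the cell with the strongest measured downlink power and then uses that same cell for both directions, since its UL grants arrive on that cell's DL control channel. The d^-alpha power model, equal transmit powers, and parameter values are illustrative assumptions.

```python
import math

def coupled_association(bs_positions, ue, alpha=3.5):
    """Sketch of coupled DL/UL association as in 3G/4G networks.

    The UE attaches to the base station with the strongest downlink
    received power; that single serving cell then carries both DL and
    UL (no decoupling), because UL resource-block grants are delivered
    on the serving cell's DL control channel. Illustrative model only:
    equal transmit powers and a simple d^-alpha path loss are assumed.
    """
    def dl_power(bs):
        d = max(math.hypot(ue[0] - bs[0], ue[1] - bs[1]), 1.0)
        return d ** (-alpha)

    serving = max(range(len(bs_positions)),
                  key=lambda i: dl_power(bs_positions[i]))
    # One serving cell for both directions.
    return {"dl_cell": serving, "ul_cell": serving}
```

With equal transmit powers the nearest cell wins in both directions, but in practice DL powers differ across cell tiers, so the DL-optimal cell need not be UL-optimal; the coupled rule returns the same cell regardless.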