What multislope path loss models tell us about the fundamental limits of wireless network densification in 5G and beyond


08 Apr 2015

The vast majority of the increase in mobile data throughput has been enabled by ever-increasing densification, i.e. adding more base stations and access points with a wired backhaul connection.  This trend is set to continue for at least the next decade, primarily through the provisioning of small cells such as picocells and femtocells.  What if we reached a point where adding more infrastructure no longer increased wireless network throughput?  This would be comparable to the impending end of "Moore's Law": a cataclysmic event with far-reaching consequences.

Existing cellular network analyses, and even simulations, typically use the standard path loss model in which received power decays like d^(-α) over a distance d, where α is called the "path loss exponent".  This standard model is quite idealized: in most scenarios the path loss exponent is itself a function of distance, typically an increasing one.  For example, a practical environment could easily have three distinct regimes: a distance-independent "near field" where α₀ = 0, a free-space-like regime where α₁ = 2, and finally some heavily attenuated regime where α₂ > 2.  Such a situation arises even with a simple two-ray ground reflection, which gives α₂ = 4.  What happens if densification pushes many BSs into the near field in such a situation?  What are the critical values of the path loss exponents beyond which cell splitting no longer yields throughput gains?
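To make the three-regime picture concrete, here is a minimal sketch of such a piecewise path gain. The breakpoint distances R1 = 10 m and R2 = 100 m are illustrative choices, not values from our model, which is general; the gain is made continuous by matching the pieces at each breakpoint.

```python
# Illustrative breakpoints (metres): edge of the near field, edge of the
# free-space-like regime.  The exponents follow the three-regime example
# with a two-ray far field: alpha0 = 0, alpha1 = 2, alpha2 = 4.
R1, R2 = 10.0, 100.0
A1, A2 = 2.0, 4.0

def multislope_gain(d):
    """Received-power gain at distance d, continuous at each breakpoint."""
    if d <= R1:
        return 1.0                                  # near field: alpha0 = 0
    if d <= R2:
        return (d / R1) ** -A1                      # free-space-like: alpha1 = 2
    return (R2 / R1) ** -A1 * (d / R2) ** -A2      # heavily attenuated: alpha2 = 4
```

With these numbers the gain is flat out to 10 m, falls by 20 dB between 10 m and 100 m (e.g. multislope_gain(100.0) returns 0.01), and then falls off much faster beyond 100 m.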

To answer such questions, we have developed a general multislope path loss model and used it to derive the distributions of SIR, SNR, and finally SINR (Signal to Interference plus Noise Ratio), before finding the potential throughput scaling, which provides insight into the observed cell-splitting rate gain.  Our mathematical results show that the SIR monotonically decreases with network density while the SNR monotonically increases, and thus the network coverage probability in terms of SINR is maximized at some finite density.  Under ultra-densification (technically, the network density going to infinity), there is a phase transition in the near-field path loss exponent: if α₀ > 1, unbounded potential throughput can be achieved asymptotically; otherwise, ultra-densification leads in the extreme case to zero throughput!
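The qualitative behaviour, with coverage peaking at a finite density and collapsing under ultra-densification when α₀ = 0, can be reproduced with a simple Monte Carlo sketch. Everything below is illustrative rather than taken from our analysis: base stations are drawn from a Poisson point process in a disc, the user sits at the centre, the strongest (nearest) BS serves, and the path gain uses assumed breakpoints R1 = 10 m and R2 = 100 m with exponents (0, 2, 4).

```python
import numpy as np

rng = np.random.default_rng(0)
R1, R2 = 10.0, 100.0                 # illustrative breakpoints (metres)

def gains(d):
    """Three-slope path gain (alpha0=0, alpha1=2, alpha2=4), continuous at R1, R2."""
    g = np.ones_like(d)
    mid, far = (d > R1) & (d <= R2), d > R2
    g[mid] = (d[mid] / R1) ** -2.0
    g[far] = (R2 / R1) ** -2.0 * (d[far] / R2) ** -4.0
    return g

def coverage(lam, theta=1.0, noise=1e-12, radius=500.0, trials=500):
    """Monte Carlo estimate of P(SINR > theta) for a user at the centre of a
    disc, with BSs drawn from a Poisson point process of density lam (BS/m^2)."""
    hits = 0
    for _ in range(trials):
        n = rng.poisson(lam * np.pi * radius ** 2)
        if n == 0:
            continue                              # no BS in range: no coverage
        d = radius * np.sqrt(rng.random(n))       # uniform BS distances in the disc
        g = gains(d)
        s = g.max()                               # strongest BS serves the user
        if s / (g.sum() - s + noise) > theta:
            hits += 1
    return hits / trials
```

At very high density, many BSs fall inside the near field where all gains are equal, so the SIR is pinned near 1/(n − 1) and coverage at θ = 0 dB collapses, while at moderate densities the fast far-field falloff keeps interference in check.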