Channel Estimation in Millimeter Wave MIMO Systems with One-Bit Quantization

29 Jan 2015
Millimeter wave (mmWave) technology can provide high-bandwidth communication links in cellular systems. Because mmWave systems use large bandwidths, the sampling rate of the analog-to-digital converter (ADC) must scale up accordingly. Unfortunately, high-speed, high-resolution (e.g., 6-12 bit) ADCs are too costly and power-hungry for portable devices. A possible solution is to use ultra-low-resolution ADCs (1-3 bits), which reduces both power consumption and cost. The vast majority of decoding methods, including our previous work, assume perfect channel estimates, which are never available in practice. Because the received signal is so coarsely quantized, channel estimation becomes a challenging problem.
 
To address this problem, WNCG graduate student Jianhua Mo, Ohio State University Professor Philip Schniter, Universidade de Vigo Professor Nuria González Prelcic and WNCG Professor Robert Heath developed channel estimation algorithms for mmWave multiple-input multiple-output (MIMO) systems with one-bit ADCs. Because the mmWave MIMO channel is sparse, owing to its propagation characteristics (only a few dominant paths), they formulated the estimation problem as a one-bit compressed sensing problem, which they solved using the generalized approximate message passing (GAMP) algorithm. Their results showed that GAMP reduces the mean squared error of the channel estimate in the important low and medium SNR regimes.
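 
To give a flavor of the problem, the sketch below simulates one-bit channel measurements and recovers a sparse channel with binary iterative hard thresholding (BIHT), a simpler one-bit compressed sensing algorithm than the GAMP estimator used in the paper. It is a minimal, real-valued, noiseless illustration with hypothetical dimensions and a Gaussian training matrix, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 256, 4  # channel length, training measurements, sparsity (all hypothetical)

# Sparse channel: k nonzero path gains, normalized to unit energy
h = np.zeros(n)
h[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
h /= np.linalg.norm(h)

A = rng.standard_normal((m, n))  # training/measurement matrix
y = np.sign(A @ h)               # one-bit quantized observations

def hard_threshold(x, k):
    # Keep the k largest-magnitude entries, zero out the rest
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

h_hat = np.zeros(n)
for _ in range(100):
    # BIHT update: gradient step toward sign consistency, then sparsify
    g = h_hat + (1.0 / m) * (A.T @ (y - np.sign(A @ h_hat)))
    h_hat = hard_threshold(g, k)
h_hat /= np.linalg.norm(h_hat)   # amplitude is lost by one-bit quantization

print("estimation error:", np.sum((h - h_hat) ** 2))

Because one-bit measurements destroy amplitude information, the channel can only be recovered up to a scale factor, which is why both vectors are normalized to the unit sphere before comparison. GAMP goes further than this sketch by modeling the likelihood of the quantized, noisy observations, which is what yields its gains at low and medium SNR.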
 
This work was presented at the 2014 Asilomar Conference on Signals, Systems, and Computers. The full paper is available online.
 
The research was funded by the National Science Foundation under Grant Nos. NSF-CCF-1319556, NSF-CCF-1018368, and NSF-CCF-1218754.