Qiuyun Zou, Haochuan Zhang, Hongwen Yang
In this paper, we extend the bilinear generalized approximate message passing (BiG-AMP) approach, originally proposed for high-dimensional generalized bilinear regression, to the multi-layer case for handling cascaded problems, such as the matrix-factorization problems arising in relay communication, among others. Assuming statistically independent matrix entries with known priors, the new algorithm, termed ML-BiGAMP, could approximate the general sum-product loopy belief propagation (LBP) in the high-dimensional limit while enjoying a substantial reduction in computational complexity. We demonstrate that, in the large-system limit, the asymptotic MSE performance of ML-BiGAMP could be fully characterized via a set of simple one-dimensional equations termed state evolution (SE). We establish that the asymptotic MSE predicted by ML-BiGAMP's SE matches perfectly the exact MMSE predicted by the replica method, which is well known to be Bayes-optimal but infeasible in practice. This consistency indicates that ML-BiGAMP may still retain the same Bayes-optimal performance as the MMSE estimator in high-dimensional applications, although ML-BiGAMP's computational burden is far lower. As an illustrative example of the general ML-BiGAMP, we provide a detector design that could jointly estimate the channel fading and the data symbols with high precision for two-hop amplify-and-forward relay communication systems.
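The SE equations of ML-BiGAMP are specific to the paper; as a purely generic illustration of what a one-dimensional state-evolution recursion looks like, the sketch below iterates the textbook SE of standard AMP for a linear model y = Ax + w with i.i.d. Gaussian sensing matrix and unit-variance Gaussian prior. All parameter values (noise variance, measurement ratio) are illustrative assumptions, not taken from the paper.

```python
# Generic one-dimensional state-evolution (SE) recursion (illustrative only;
# this is the SE of standard AMP for a linear-Gaussian model, NOT ML-BiGAMP's SE).

def mmse_gauss(v):
    # Scalar MMSE of X ~ N(0, 1) observed through AWGN of variance v:
    # posterior variance = v / (1 + v).
    return v / (1.0 + v)

def state_evolution(sigma2=0.1, delta=0.5, v0=1.0, iters=50):
    # sigma2: measurement-noise variance; delta = M/N: measurement ratio.
    # Iterate the effective-noise-variance update to its fixed point.
    v = v0
    for _ in range(iters):
        v = sigma2 + mmse_gauss(v) / delta
    return v

v_star = state_evolution()
print(v_star)  # fixed point of v = 0.1 + 2 v / (1 + v)
```

The fixed point v* of this recursion predicts the asymptotic per-coordinate MSE of the corresponding iterative algorithm, which is the sense in which a set of simple scalar equations can characterize a high-dimensional estimator.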