Xiangming Meng, Sheng Wu, Jiang Zhu
In this letter, we present a unified Bayesian inference framework for generalized linear models (GLM) which iteratively reduces the GLM problem to a sequence of standard linear model (SLM) problems. This framework provides new perspectives on some established GLM algorithms derived from SLM ones and also suggests novel extensions for some other SLM algorithms. Specific instances elucidated under such framework are the GLM versions of approximate message passing (AMP), vector AMP (VAMP), and sparse Bayesian learning (SBL). It is proved that the resultant GLM version of AMP is equivalent to the well-known generalized approximate message passing (GAMP). Numerical results for 1-bit quantized compressed sensing (CS) demonstrate the effectiveness of this unified framework.
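As background for the 1-bit quantized CS experiments mentioned above, the sketch below illustrates the measurement model typically used in that setting: a sparse signal passes through a linear map (the SLM core) and is then quantized componentwise by a sign function, which is what makes the overall model a GLM. All variable names and parameter values here are illustrative assumptions, not taken from the letter.

```python
import numpy as np

# Illustrative 1-bit quantized CS measurement model: y = sign(A @ x + w).
# The inner linear stage z = A @ x is the "standard linear model" core;
# the sign quantizer is the componentwise (nonlinear) GLM output channel.
rng = np.random.default_rng(0)

n, m, k = 256, 128, 10          # signal length, measurements, sparsity (assumed values)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # k-sparse signal

A = rng.standard_normal((m, n)) / np.sqrt(m)  # i.i.d. Gaussian sensing matrix
w = 0.01 * rng.standard_normal(m)             # additive pre-quantization noise
y = np.sign(A @ x + w)                        # 1-bit quantized measurements
```

Recovering `x` from `y` and `A` is the inference task to which GLM algorithms such as GAMP, or the SLM-to-GLM reductions described in the abstract, are applied.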