Weizhong Zhang, Lijun Zhang, Rong Jin, Deng Cai, Xiaofei He
In this paper, we present an accelerated numerical method based on random projection for sparse linear regression. Previous studies have shown that under appropriate conditions, gradient-based methods enjoy a geometric convergence rate when applied to this problem. However, the time complexity of evaluating the gradient is as large as $\mathcal{O}(nd)$, where $n$ is the number of data points and $d$ is the dimensionality, making those methods inefficient for large-scale and high-dimensional datasets. To address this limitation, we first use random projection to find a rank-$k$ approximation of the data matrix, which reduces the cost of gradient evaluation to $\mathcal{O}(nk+dk)$, a significant improvement when $k$ is much smaller than $d$ and $n$. Then, we solve the sparse linear regression problem via a proximal gradient method with a homotopy strategy to generate sparse intermediate solutions. Theoretical analysis shows that our method also achieves a global geometric convergence rate, and moreover the sparsity of all the intermediate solutions is well bounded over the iterations. Finally, we conduct experiments to demonstrate the efficiency of the proposed method.
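The two ingredients described above can be sketched in NumPy: a randomized rank-$k$ factorization $X \approx QB$ obtained via random projection, a gradient routine that exploits the factored form so each evaluation costs $\mathcal{O}(nk+dk)$ instead of $\mathcal{O}(nd)$, and the soft-thresholding proximal operator used for the sparse step. This is a minimal illustration of the general technique, not the paper's exact algorithm; the homotopy strategy over the regularization parameter is omitted, and all function names are chosen here for illustration.

```python
import numpy as np

def rank_k_sketch(X, k, rng):
    """Randomized rank-k factorization X ~= Q @ B via random projection.
    Q is an n x k orthonormal basis for the (approximate) range of X,
    B = Q^T X is k x d."""
    n, d = X.shape
    Omega = rng.standard_normal((d, k))   # random projection matrix
    Q, _ = np.linalg.qr(X @ Omega)        # orthonormal range basis, n x k
    B = Q.T @ X                           # k x d factor
    return Q, B

def approx_grad(Q, B, w, y):
    """Gradient of the least-squares loss (1/2n)||Xw - y||^2 using the
    factored matrix: every product involves Q or B only, so the cost is
    O(nk + dk) rather than O(nd)."""
    n = Q.shape[0]
    r = Q @ (B @ w) - y                   # approximate residual
    return B.T @ (Q.T @ r) / n

def soft_threshold(w, t):
    """Proximal operator of t * ||.||_1; keeps iterates sparse."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)
```

A proximal gradient iteration then alternates `w = soft_threshold(w - eta * approx_grad(Q, B, w, y), eta * lam)` for a step size `eta` and regularization weight `lam`. When `X` has exact rank $k$, `Q @ B` reproduces `X` and the approximate gradient matches the exact one.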
Detao Qi, Weijun Zeng, Wei Tong, Jian Yang
Qiang Sun, Hongtu Zhu, Yufeng Liu, Joseph G. Ibrahim
Roman Parzer, Peter Filzmoser, Laura Vana-Gür