JOURNAL ARTICLE

SPARSE REGULARIZATION FOR BI-LEVEL VARIABLE SELECTION

Hidetoshi Matsui

Year: 2015 · Journal: Journal of the Japanese Society of Computational Statistics · Vol. 28(1), pp. 83-103

Abstract

Sparse regularization provides solutions in which some parameters are exactly zero, and it can therefore be used for selecting variables in regression models. The lasso was proposed as a method for selecting individual variables in regression models. The group lasso, on the other hand, selects groups of variables rather than individual variables, and has therefore been used in various fields of application. More recently, penalties that select variables at both the group and individual levels have been considered; this is the so-called bi-level selection. In this paper we focus on several penalties that aim at bi-level selection. We give an overview of these penalties and their estimation algorithms, and then compare their effectiveness in terms of prediction accuracy and the selection of variables and groups through simulation studies.
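To make the bi-level idea concrete, the following is a minimal sketch (not the paper's own algorithm) of the sparse group lasso, one common bi-level penalty: proximal gradient descent on the least-squares loss plus lam1*||beta||_1 + lam2*sum_g sqrt(p_g)*||beta_g||_2. Function names and tuning values here are illustrative assumptions.

```python
import numpy as np

def prox_sparse_group_lasso(beta, groups, lam1, lam2, step):
    """Proximal operator of step * (lam1*||.||_1 + lam2*sum_g sqrt(p_g)*||.||_2):
    elementwise soft-thresholding (individual level) followed by
    groupwise shrinkage (group level)."""
    b = np.sign(beta) * np.maximum(np.abs(beta) - step * lam1, 0.0)
    for g in groups:
        norm = np.linalg.norm(b[g])
        if norm > 0.0:
            b[g] *= max(1.0 - step * lam2 * np.sqrt(len(g)) / norm, 0.0)
    return b

def fit_sparse_group_lasso(X, y, groups, lam1=0.1, lam2=0.1, n_iter=1000):
    """Proximal gradient descent on (1/2n)||y - X beta||^2 plus the
    sparse group lasso penalty; returns the estimated coefficients."""
    n, p = X.shape
    # Step size 1/L, where L is the Lipschitz constant of the gradient.
    step = n / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = prox_sparse_group_lasso(beta - step * grad, groups,
                                       lam1, lam2, step)
    return beta
```

With lam2 = 0 this reduces to the ordinary lasso (individual selection only); with lam1 = 0 it reduces to the group lasso (whole groups in or out). Using both produces bi-level behavior: entire irrelevant groups are zeroed, and within an active group individual coefficients can still be set to zero.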

Keywords:
Lasso; Group lasso; Feature selection; Regularization; Elastic net regularization; Regression analysis; Mathematical optimization; Machine learning; Statistics

Metrics

Cited By: 2
FWCI (Field-Weighted Citation Impact): 0.00
References: 77
Citation Normalized Percentile: 0.18

Topics

Statistical Methods and Inference
Physical Sciences →  Mathematics →  Statistics and Probability
Control Systems and Identification
Physical Sciences →  Engineering →  Control and Systems Engineering
Sparse and Compressive Sensing Techniques
Physical Sciences →  Engineering →  Computational Mechanics
