JOURNAL ARTICLE

Two-stage subsampling variable selection for sparse high-dimensional generalized linear models

Marinela Capanu, Mihai Giurcanu, Colin B. Begg, Mithat Gönen

Journal: Statistical Methods in Medical Research Year: 2025 Vol: 34 (7) Pages: 1504-1521 Publisher: SAGE Publishing

Abstract

Although high-dimensional data analysis has received a lot of attention since the advent of omics data, model selection in this setting continues to be challenging and there is still substantial room for improvement. Through a novel combination of existing methods, we propose here a two-stage subsampling approach for variable selection in high-dimensional generalized linear regression models. In the first stage, we screen the variables using smoothly clipped absolute deviation (SCAD) penalty regularization followed by partial least squares regression on repeated subsamples of the data; we include in the second stage only those predictors that were most frequently selected over the subsamples, either by SCAD or for having the top loadings in either of the first two partial least squares regression components. In the second stage, we again repeatedly subsample the data and, for each subsample, we find the best Akaike information criterion model based on an exhaustive search of all possible models on the reduced set of predictors. We then include in the final model those predictors with high selection probability across the subsamples. We prove that the proposed first-stage estimator is n^{1/2}-consistent and that the true predictors are included in the first stage with probability converging to 1. In an extensive simulation study, we show that this two-stage approach outperforms competitors, yielding among the highest probabilities of selecting the true model while producing among the fewest false positives in the settings of logistic, Poisson, and linear regression. We illustrate the proposed method on two gene expression cancer datasets.
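The two-stage procedure in the abstract can be sketched in code for the linear-regression case. The sketch below is an illustration, not the authors' implementation: it substitutes a plain coordinate-descent lasso for the SCAD penalty, computes first-two-component PLS1 loadings via NIPALS-style deflation (without y-deflation), and uses OLS-based AIC in the exhaustive second-stage search. All function names, tuning constants (`lam`, `keep`, `thresh`, subsample fraction), and thresholds are assumptions chosen for the example.

```python
import itertools
import numpy as np

def pls_loadings(X, y, n_comp=2):
    """Loadings of the first n_comp PLS1 components (NIPALS-style deflation)."""
    Xd = X - X.mean(0)
    yd = y - y.mean()
    loadings = []
    for _ in range(n_comp):
        w = Xd.T @ yd
        w /= np.linalg.norm(w)          # weight vector for this component
        t = Xd @ w                      # scores
        p = Xd.T @ t / (t @ t)          # loadings
        loadings.append(p)
        Xd = Xd - np.outer(t, p)        # deflate X before the next component
    return np.column_stack(loadings)

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate-descent lasso, used here as a stand-in for the SCAD penalty."""
    Xc = X - X.mean(0)
    yc = y - y.mean()
    beta = np.zeros(X.shape[1])
    col_sq = (Xc ** 2).sum(0)
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            r = yc - Xc @ beta + Xc[:, j] * beta[j]   # partial residual
            z = Xc[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return beta

def aic_ols(X, y):
    """AIC of a Gaussian OLS fit with intercept (up to an additive constant)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = ((y - Xd @ coef) ** 2).sum()
    return n * np.log(rss / n) + 2 * Xd.shape[1]

def two_stage_select(X, y, n_sub=30, frac=0.5, keep=5, top_load=3,
                     lam=5.0, thresh=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = int(frac * n)
    # Stage 1: screen by selection frequency over repeated subsamples.
    counts = np.zeros(p)
    for _ in range(n_sub):
        idx = rng.choice(n, m, replace=False)
        Xs, ys = X[idx], y[idx]
        counts[np.abs(lasso_cd(Xs, ys, lam)) > 1e-8] += 1   # penalized screen
        L = pls_loadings(Xs, ys)
        for c in range(L.shape[1]):                         # top PLS loadings
            counts[np.argsort(-np.abs(L[:, c]))[:top_load]] += 1
    screened = np.argsort(-counts)[:keep]
    # Stage 2: exhaustive best-AIC search on each subsample, then vote.
    sel = np.zeros(p)
    for _ in range(n_sub):
        idx = rng.choice(n, m, replace=False)
        best_aic, best_set = np.inf, ()
        for r in range(1, len(screened) + 1):
            for S in itertools.combinations(screened, r):
                a = aic_ols(X[idx][:, S], y[idx])
                if a < best_aic:
                    best_aic, best_set = a, S
        sel[list(best_set)] += 1
    return np.sort(np.where(sel / n_sub >= thresh)[0])
```

On simulated data with a few strong true predictors, the voting step in stage 2 tends to retain those predictors with frequency near 1 while screened-in noise variables fall below the 0.5 threshold; for logistic or Poisson responses the OLS/AIC pieces would be replaced by the corresponding GLM deviance-based AIC.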

Keywords:
Akaike information criterion, Deviance (statistics), Model selection, Estimator, Linear regression, Feature selection, Partial least squares regression, Generalized linear model, Lasso (statistics), Information criteria, Regression, High-dimensional data

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
References: 58
Citation Normalized Percentile: 0.20

Topics

Statistical Methods and Inference
Physical Sciences →  Mathematics →  Statistics and Probability
Gene expression and cancer classification
Life Sciences →  Biochemistry, Genetics and Molecular Biology →  Molecular Biology
Bioinformatics and Genomic Networks
Life Sciences →  Biochemistry, Genetics and Molecular Biology →  Molecular Biology

Related Documents

JOURNAL ARTICLE

Subsampling based variable selection for generalized linear models

Marinela Capanu, Mihai Giurcanu, Colin B. Begg, Mithat Gönen

Journal: Computational Statistics & Data Analysis Year: 2023 Vol: 184 Pages: 107740
JOURNAL ARTICLE

Variable selection in high-dimensional double generalized linear models

Dengke Xu, Zhongzhan Zhang, Liucang Wu

Journal: Statistical Papers Year: 2012 Vol: 55 (2) Pages: 327-347
JOURNAL ARTICLE

Variable selection in high-dimensional sparse multiresponse linear regression models

Shan Luo

Journal: Statistical Papers Year: 2018 Vol: 61 (3) Pages: 1245-1267
JOURNAL ARTICLE

Forward variable selection for sparse ultra-high-dimensional generalized varying coefficient models

Toshio Honda, Chien-Tong Lin

Journal: Japanese Journal of Statistics and Data Science Year: 2020 Vol: 4 (1) Pages: 151-179