JOURNAL ARTICLE

Robust feature selection for block covariance Bayesian models

Abstract

Recent work proposes new algorithms for feature selection based on a Bayesian hierarchical model that places priors on both the identity of each feature and the identity-conditioned feature-label distribution. Given training data, Bayesian inference can then be used to predict the feature identities. While algorithms developed in prior work rely on certain independence assumptions, in this work we present a new algorithm, with low computational complexity, designed for a family of Bayesian models that each assume a different block covariance structure. We show that both the new algorithm and the previous algorithm, which assumes independent features, perform robustly across the family of models on synthetic data, and we provide results on real colon cancer microarray data.
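The paper's algorithms are not reproduced here, but the core ingredient the abstract describes (scoring each feature by the posterior probability of its identity, i.e., discriminating vs. non-discriminating) can be sketched under the independent-feature assumption it mentions. Everything below is an illustrative assumption rather than the paper's actual model: a Gaussian likelihood with known variance, a zero-mean Gaussian prior on class means, and a 0.5 prior on each feature being discriminating.

```python
import numpy as np

def log_marginal(x, sigma2=1.0, tau2=1.0):
    """Log marginal likelihood of x_1..x_n iid N(mu, sigma2) with mu ~ N(0, tau2),
    with mu integrated out analytically (known-variance conjugate case)."""
    n = x.size
    s, ss = x.sum(), np.dot(x, x)
    return (-0.5 * n * np.log(2 * np.pi * sigma2)
            - 0.5 * np.log(1.0 + n * tau2 / sigma2)
            - ss / (2.0 * sigma2)
            + tau2 * s**2 / (2.0 * sigma2 * (sigma2 + n * tau2)))

def selection_posterior(x0, x1, prior=0.5, sigma2=1.0, tau2=1.0):
    """Posterior probability that a single feature is discriminating.
    'Discriminating': each class gets its own mean; 'non-discriminating':
    one shared mean for the pooled samples. x0, x1 are the feature's values
    in class 0 and class 1."""
    log_disc = log_marginal(x0, sigma2, tau2) + log_marginal(x1, sigma2, tau2)
    log_null = log_marginal(np.concatenate([x0, x1]), sigma2, tau2)
    a = np.log(prior) + log_disc          # log prior * evidence, discriminating
    b = np.log(1.0 - prior) + log_null    # log prior * evidence, non-discriminating
    m = max(a, b)                         # normalize stably without scipy
    return np.exp(a - (m + np.log(np.exp(a - m) + np.exp(b - m))))

rng = np.random.default_rng(0)
# Feature A: class means clearly differ; feature B: pure noise in both classes.
a0, a1 = rng.normal(+2.0, 1.0, 50), rng.normal(-2.0, 1.0, 50)
b0, b1 = rng.normal(0.0, 1.0, 50), rng.normal(0.0, 1.0, 50)
post_a = selection_posterior(a0, a1)  # near 1: feature selected
post_b = selection_posterior(b0, b1)  # much lower: little evidence of separation
print(post_a, post_b)
```

Because the sketch treats features independently, the score factorizes over features; the block covariance models studied in the paper couple the features, which is precisely why a different algorithm is needed there.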

Keywords:
Feature selection, Bayesian inference, Bayesian hierarchical modeling, Prior probability, Covariance, Machine learning, Pattern recognition, Data mining

Metrics

Cited by: 12
FWCI (Field-Weighted Citation Impact): 1.60
References: 17
Citation Normalized Percentile: 0.79

Topics

Gene expression and cancer classification
Life Sciences → Biochemistry, Genetics and Molecular Biology → Molecular Biology
Machine Learning in Bioinformatics
Life Sciences → Biochemistry, Genetics and Molecular Biology → Molecular Biology
Image Retrieval and Classification Techniques
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition