JOURNAL ARTICLE

K-Dependence Bayesian Classifier Ensemble

Zhiyi Duan, Limin Wang

Year: 2017   Journal: Entropy   Vol: 19 (12)   Pages: 651   Publisher: Multidisciplinary Digital Publishing Institute

Abstract

To maximize the benefit that can be derived from the information implicit in big data, ensemble methods generate multiple models with sufficient diversity through randomization or perturbation. A k-dependence Bayesian classifier (KDB) is a highly scalable learning algorithm with excellent time and space complexity, along with high expressivity. This paper introduces a new ensemble approach for KDBs, the k-dependence forest (KDF), which induces a specific attribute order and set of conditional dependencies between attributes for each subclassifier. We demonstrate that these subclassifiers are diverse and complementary. Our extensive experimental evaluation on 40 datasets reveals that this ensemble method achieves better classification performance than state-of-the-art out-of-core ensemble learners such as the averaged one-dependence estimator (AODE) and the averaged tree-augmented naive Bayes (ATAN).
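The abstract's description of KDB can be made concrete with a small sketch. Below is a minimal, hypothetical Python illustration of the standard single-KDB construction (rank attributes by mutual information with the class, then give each attribute up to k parents chosen by conditional mutual information given the class); it is not the authors' KDF code, and every function name and the toy data are assumptions introduced only for illustration.

# Minimal, illustrative sketch of a k-dependence Bayesian classifier (KDB)
# on discrete data. This is NOT the authors' KDF implementation; all names
# here (kdb_fit, kdb_predict, ...) are hypothetical.
import numpy as np
from collections import defaultdict


def mutual_information(x, y):
    """I(X; Y) in nats for two discrete 1-D arrays."""
    mi = 0.0
    for vx in np.unique(x):
        for vy in np.unique(y):
            pxy = np.mean((x == vx) & (y == vy))
            if pxy > 0:
                mi += pxy * np.log(pxy / (np.mean(x == vx) * np.mean(y == vy)))
    return mi


def conditional_mutual_information(x, z, y):
    """I(X; Z | Y): mutual information within each class, averaged over P(Y)."""
    return sum(np.mean(y == vy) * mutual_information(x[y == vy], z[y == vy])
               for vy in np.unique(y))


def kdb_fit(X, y, k=2, alpha=1.0):
    """Learn an attribute order, parent sets, and Laplace-smoothed CPTs."""
    n, d = X.shape
    classes = np.unique(y)
    # 1. order attributes by decreasing mutual information with the class
    order = sorted(range(d), key=lambda i: -mutual_information(X[:, i], y))
    # 2. each attribute takes up to k parents from higher-ranked attributes,
    #    chosen by conditional mutual information given the class
    parents = {}
    for pos, i in enumerate(order):
        earlier = order[:pos]
        ranked = sorted(earlier, key=lambda j: -conditional_mutual_information(
            X[:, i], X[:, j], y))
        parents[i] = ranked[:k]
    # 3. count joint configurations for P(x_i | parents(x_i), c)
    counts, pa_counts = defaultdict(float), defaultdict(float)
    for row, c in zip(X, y):
        for i in range(d):
            key = (i, tuple(row[j] for j in parents[i]), c)
            counts[key + (row[i],)] += 1.0
            pa_counts[key] += 1.0
    prior = {c: (np.sum(y == c) + alpha) / (n + alpha * len(classes)) for c in classes}
    card = {i: len(np.unique(X[:, i])) for i in range(d)}
    return dict(parents=parents, counts=counts, pa_counts=pa_counts,
                prior=prior, classes=classes, card=card, alpha=alpha)


def kdb_predict(model, x):
    """Return argmax_c  log P(c) + sum_i log P(x_i | parents(x_i), c)."""
    best, best_lp = None, -np.inf
    for c in model["classes"]:
        lp = np.log(model["prior"][c])
        for i, ps in model["parents"].items():
            key = (i, tuple(x[j] for j in ps), c)
            num = model["counts"][key + (x[i],)] + model["alpha"]
            den = model["pa_counts"][key] + model["alpha"] * model["card"][i]
            lp += np.log(num / den)
        if lp > best_lp:
            best, best_lp = c, lp
    return best


# toy usage on synthetic discrete data (assumed, for illustration only)
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(300, 4))
y = (X[:, 0] + X[:, 1] > 2).astype(int)
model = kdb_fit(X, y, k=2)
print(kdb_predict(model, X[0]), y[0])

The KDF ensemble proposed in the paper builds several such KDB subclassifiers, each induced with its own attribute order and conditional-dependency structure; the sketch above covers a single subclassifier only, and the paper's specific ordering and combination scheme is not reproduced here.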

Keywords:
Ensemble learning, Naive Bayes classifier, Computer science, Artificial intelligence, Random forest, Machine learning, Scalability, Classifier (UML), Bayesian probability, Pattern recognition (psychology), Data mining, Support vector machine

Metrics

Cited By: 16
FWCI (Field Weighted Citation Impact): 2.52
References: 32
Citation Normalized Percentile: 0.91
Is in top 1%
Is in top 10%

Topics

Bayesian Modeling and Causal Inference (Physical Sciences → Computer Science → Artificial Intelligence)
Data Mining Algorithms and Applications (Physical Sciences → Computer Science → Information Systems)
Metabolomics and Mass Spectrometry Studies (Life Sciences → Biochemistry, Genetics and Molecular Biology → Molecular Biology)