JOURNAL ARTICLE

Alleviating naive Bayes attribute independence assumption by attribute weighting

Nayyar A. Zaidi, Jesús Cerquides, Mark Carman, Geoffrey I. Webb

Year: 2013  Journal: Monash University Research Portal (Monash University)  Vol: 14 (1)  Pages: 1947-1988  Publisher: Monash University

Abstract

Despite the simplicity of the Naive Bayes classifier, it has continued to perform well against more sophisticated newcomers and has therefore remained of great interest to the machine learning community. Of the numerous approaches to refining the naive Bayes classifier, attribute weighting has received less attention than it warrants. Most approaches, perhaps influenced by attribute weighting in other machine learning algorithms, use weighting to place more emphasis on highly predictive attributes than on those that are less predictive. In this paper, we argue that for naive Bayes attribute weighting should instead be used to alleviate the conditional independence assumption. Based on this premise, we propose a weighted naive Bayes algorithm, called WANBIA, that selects weights to minimize either the negative conditional log likelihood or the mean squared error objective function. We perform extensive evaluations and find that WANBIA is a competitive alternative to state-of-the-art classifiers such as Random Forest, Logistic Regression and A1DE. © 2013 Nayyar A. Zaidi, Jesus Cerquides, Mark J. Carman and Geoffrey I. Webb.
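The weighted naive Bayes model described in the abstract scores a class by exponentiating each attribute's conditional probability with a per-attribute weight, i.e. P(y) · ∏ᵢ P(xᵢ | y)^{wᵢ}, so that w = 1 recovers standard naive Bayes and smaller weights discount attributes whose independence assumption is most violated. The sketch below illustrates that scoring rule in log space; it is a minimal illustration with made-up toy probabilities, not the authors' WANBIA implementation (which learns the weights by optimizing conditional log likelihood or mean squared error).

```python
import numpy as np

def weighted_nb_log_scores(x, log_priors, log_cond, w):
    """Per-class log score of weighted naive Bayes:
    log P(y) + sum_i w[i] * log P(x_i = x[i] | y).

    x          : (n_attrs,) attribute value indices of one instance
    log_priors : (n_classes,) log class priors
    log_cond   : (n_classes, n_attrs, n_values) log conditional probabilities
    w          : (n_attrs,) attribute weights; w = 1 gives standard naive Bayes
    """
    n_attrs = len(x)
    # pick out log P(x_i = x[i] | y) for every class -> (n_classes, n_attrs)
    per_attr = log_cond[:, np.arange(n_attrs), x]
    return log_priors + per_attr @ w

# Toy example (hypothetical numbers): 2 classes, 2 binary attributes.
log_priors = np.log(np.array([0.5, 0.5]))
log_cond = np.log(np.array([
    [[0.8, 0.2], [0.7, 0.3]],   # class 0: P(x_i = v | y = 0)
    [[0.3, 0.7], [0.4, 0.6]],   # class 1: P(x_i = v | y = 1)
]))
x = np.array([0, 0])
w = np.ones(2)                  # unit weights = ordinary naive Bayes

scores = weighted_nb_log_scores(x, log_priors, log_cond, w)
pred = int(np.argmax(scores))   # class 0 wins here: 0.5*0.8*0.7 > 0.5*0.3*0.4
```

In WANBIA the vector `w` would be fitted to the training data rather than set by hand; the point of the sketch is only the functional form of the weighted score.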

Keywords:
Naive Bayes classifier, Weighting, Machine learning, Conditional independence, Artificial intelligence, Computer science, Bayes error rate, Classifier (UML), Bayes' theorem, Random forest, Logistic regression, Mathematics, Data mining, Bayes classifier, Support vector machine, Bayesian probability

Metrics

Cited By: 171
FWCI (Field Weighted Citation Impact): 20.75
Refs: 43
Citation Normalized Percentile: 0.99 (in top 1%)

Topics

Bayesian Modeling and Causal Inference (Physical Sciences → Computer Science → Artificial Intelligence)
Imbalanced Data Classification Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Machine Learning and Data Classification (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Alleviating conditional independence assumption of naive Bayes

Xu-Qing Liu, Xiao-Cai Wang, Tao Li, Fengxian An, Gui-Ren Jiang

Journal: Statistical Papers  Year: 2023  Vol: 65 (5)  Pages: 2835-2863
JOURNAL ARTICLE

Toward naive Bayes with attribute value weighting

Liangjun Yu, Liangxiao Jiang, Dianhong Wang, Lungan Zhang

Journal: Neural Computing and Applications  Year: 2018  Vol: 31 (10)  Pages: 5699-5713
JOURNAL ARTICLE

Self-adaptive attribute weighting for Naive Bayes classification

Jia Wu, Shirui Pan, Xingquan Zhu, Zhihua Cai, Peng Zhang, Chengqi Zhang

Journal: Expert Systems with Applications  Year: 2014  Vol: 42 (3)  Pages: 1487-1502
JOURNAL ARTICLE

Class-specific attribute value weighting for Naive Bayes

Huan Zhang, Liangxiao Jiang, Liangjun Yu

Journal: Information Sciences  Year: 2019  Vol: 508  Pages: 260-274
JOURNAL ARTICLE

A Regularized Attribute Weighting Framework for Naive Bayes

Shihe Wang, Jianfeng Ren, Ruibin Bai

Journal: IEEE Access  Year: 2020  Vol: 8  Pages: 225639-225649