JOURNAL ARTICLE

Bayesian Network Structure Learning Based on Parallel Predictive Simulated Annealing

Abstract

Simulated Annealing (SA) is an effective method for Bayesian Network Structure Learning (BNSL). However, it requires substantial search time on large-scale data. Moreover, to maintain parallel efficiency, the traditional multi-chain parallel SA approach often reduces the number of iterations per chain, which leads to insufficiently thorough searches when many threads are employed. Additionally, SA uses a best-solution update strategy during information exchange, which makes it prone to becoming trapped in local optima. To address these issues, this study proposes a BNSL algorithm based on Parallel Prediction-Based SA (PPBSA). The algorithm preserves search thoroughness under parallelization and can escape local optima during the information-exchange phase. In the annealing stage of PPBSA, several generations of predicted solutions following the current solution, together with their corresponding scores, are generated in parallel; this guarantees search depth while substantially accelerating the search by reducing the time spent generating and scoring successor solutions. When threads exchange information, a tabu list restricts the search of threads whose solutions have fallen into local optima, enhancing their ability to escape. Furthermore, exploiting the decomposability of the BDeu score, the score difference before and after each perturbation in the SA process is computed directly, significantly reducing computational redundancy. A series of experiments on a set of benchmark Bayesian networks compares the proposed algorithm with serial SA and other algorithms. The results demonstrate that the proposed algorithm achieves speedups of more than five times in some cases while maintaining accuracy.
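The score-decomposability idea the abstract describes can be sketched in a few lines: for a decomposable score such as BDeu, the graph score is a sum of per-node family scores, so a single-edge perturbation only changes one node's local term. The sketch below is a minimal illustration under stated assumptions: `local_score` is a toy parent-set penalty standing in for the actual BDeu score, and the move set and fixed temperature are illustrative, not the paper's PPBSA procedure.

```python
import math
import random

def local_score(v, parents):
    """Hypothetical decomposable local score (toy placeholder, NOT BDeu):
    penalize large parent sets so the annealer has something to optimize."""
    return -float(len(parents)) ** 2 - 0.1 * v

def total_score(parents_of):
    """Decomposable graph score: sum of per-node family scores."""
    return sum(local_score(v, ps) for v, ps in enumerate(parents_of))

def has_path(parents_of, src, dst):
    """True if a directed path src -> ... -> dst exists (DFS over children)."""
    children = [[] for _ in parents_of]
    for child, ps in enumerate(parents_of):
        for p in ps:
            children[p].append(child)
    stack, seen = [src], set()
    while stack:
        x = stack.pop()
        if x == dst:
            return True
        if x not in seen:
            seen.add(x)
            stack.extend(children[x])
    return False

def sa_step(parents_of, temp, rng):
    """One SA move: toggle a random edge u -> v, accepted via Metropolis.
    Only node v's family is rescored, so the delta costs O(1) score calls."""
    n = len(parents_of)
    u, v = rng.randrange(n), rng.randrange(n)
    if u == v:
        return 0.0
    old = parents_of[v]
    if u in old:
        new = old - {u}
    elif has_path(parents_of, v, u):   # adding u -> v would close a cycle
        return 0.0
    else:
        new = old | {u}
    delta = local_score(v, new) - local_score(v, old)
    if delta >= 0 or rng.random() < math.exp(delta / temp):
        parents_of[v] = new
        return delta
    return 0.0
```

Because the score is decomposable, the accumulated per-move deltas exactly track the full graph score, which is what lets the paper's approach avoid rescoring the whole network after every perturbation.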

Keywords:
Simulated annealing; Tabu search; Local optimum; Thread (computing); Benchmark (surveying); Local search (optimization); Adaptive simulated annealing; Execution time

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.83

Topics

Bayesian Modeling and Causal Inference
Physical Sciences →  Computer Science →  Artificial Intelligence
Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Parallel Simulated Annealing with a Greedy Algorithm for Bayesian Network Structure Learning

Sangmin Lee, Seoung Bum Kim

Journal: IEEE Transactions on Knowledge and Data Engineering  Year: 2019  Vol: 32 (6)  Pages: 1157-1166
CONFERENCE PAPER

Fast Parallel Bayesian Network Structure Learning

Jiantong Jiang, Zeyi Wen, Ajmal Mian

Proceedings: 2022 IEEE International Parallel and Distributed Processing Symposium (IPDPS)  Year: 2022  Pages: 617-627
JOURNAL ARTICLE

Bayesian network structure learning using quantum annealing

Bryan O’Gorman, Ryan Babbush, Alejandro Perdomo‐Ortiz, Alán Aspuru‐Guzik, Vadim Smelyanskiy

Journal: The European Physical Journal Special Topics  Year: 2015  Vol: 224 (1)  Pages: 163-188
JOURNAL ARTICLE

Parallel and Distributed Bayesian Network Structure Learning

Jian Yang, Jiantong Jiang, Zeyi Wen, Ajmal Mian

Journal: IEEE Transactions on Parallel and Distributed Systems  Year: 2023  Vol: 35 (4)  Pages: 517-530