HUANG Yun, CHEN Ruoyan, MA Li, CAI Yiming, LU Hengyang, FANG Wei
Simulated Annealing (SA) is an effective method for Bayesian Network Structure Learning (BNSL). However, when handling large-scale data, it requires a significant amount of search time. Moreover, to maintain parallel efficiency, the traditional multi-chain SA parallelization approach often requires a reduction in the number of iterations, which leads to insufficiently thorough searches when many threads are employed. Additionally, SA employs an optimal-selection update strategy during the information-exchange process, which makes it prone to becoming trapped in local optima. To address these issues, this study proposes a BNSL algorithm based on a Parallel Prediction-Based SA (PPBSA) algorithm. This algorithm ensures thoroughness of the search during parallelization and possesses the ability to escape local optima during the information-exchange phase. In the annealing stage of PPBSA, several generations of predicted solutions and their corresponding scores following the current solution are generated in parallel. This approach guarantees search depth while substantially accelerating the search by reducing the time spent generating and scoring subsequent solutions. When threads exchange information, a tabu list is used to restrict the search for thread solutions that have fallen into local optima, thereby enhancing the ability of the solutions to escape local optima. Furthermore, based on the decomposability of the BDeu score, the score difference before and after a perturbation in the SA process is calculated directly, significantly reducing computational redundancy. A series of experiments on a set of benchmark Bayesian networks compares the proposed algorithm with serial SA and other algorithms. The results demonstrate that the proposed algorithm achieves speedups of more than five times in some cases while maintaining accuracy.
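The decomposability argument in the abstract can be illustrated with a minimal sketch: because a decomposable score (such as BDeu) is a sum of per-node family scores, toggling a single edge u → v only changes the family score of v, so SA can evaluate a perturbation by recomputing one term instead of the whole network score. The `family_score` below is a hypothetical stand-in, not the real BDeu formula, and acyclicity checking is omitted for brevity.

```python
import math
import random

def family_score(node, parents):
    # Hypothetical stand-in for a BDeu family score (penalises large parent
    # sets); a real implementation would compute it from data counts.
    return -len(parents) ** 2 + 0.5 * len(parents)

def total_score(parent_sets):
    # A decomposable score is the sum of independent per-node family scores.
    return sum(family_score(v, ps) for v, ps in parent_sets.items())

def score_delta(parent_sets, u, v):
    # Score change from toggling edge u -> v: only v's family term changes.
    old = family_score(v, parent_sets[v])
    new = family_score(v, parent_sets[v] ^ {u})  # toggle membership of u
    return new - old

def anneal(nodes, steps=2000, t0=1.0, cooling=0.995, seed=0):
    rng = random.Random(seed)
    parent_sets = {v: set() for v in nodes}   # start from the empty graph
    score, t = total_score(parent_sets), t0
    for _ in range(steps):
        u, v = rng.sample(nodes, 2)           # propose toggling edge u -> v
        delta = score_delta(parent_sets, u, v)
        # Metropolis acceptance: always take improvements, sometimes accept
        # worsening moves while the temperature is high.
        if delta > 0 or rng.random() < math.exp(delta / t):
            parent_sets[v] ^= {u}
            score += delta                    # incremental update, no rescan
        t *= cooling
    return parent_sets, score
```

The paper's PPBSA additionally runs several predicted successor solutions (and their scores) in parallel per step and coordinates chains through a tabu list; the sketch only shows the serial incremental-scoring core.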