Min-based (or qualitative) possibilistic networks are important tools for efficiently and compactly representing and analyzing uncertain information. Inference is a crucial task in min-based networks; it consists of propagating information through the network structure to answer queries. Exact inference computes posterior possibility distributions, given some observed evidence, in time proportional to the number of nodes of the network when it is simply connected (i.e., has no loops). On multiply connected networks (with loops), exact inference is known to be a hard problem. This paper proposes an approximate algorithm for inference in min-based possibilistic networks. More precisely, we adapt the well-known approximate algorithm Loopy Belief Propagation (LBP) to qualitative possibilistic networks. We provide various experimental results that analyze the convergence of possibilistic LBP.
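To illustrate the idea behind the adaptation, here is a minimal sketch (not the authors' implementation) of one message update in min-based possibilistic loopy belief propagation: local possibility degrees are combined with min and marginalized with max (max-min propagation), in place of the product/sum operations of probabilistic LBP. All distribution and potential names below are illustrative assumptions.

```python
def update_message(prior_i, potential_ij, incoming, states_i, states_j):
    """Sketch of the message from node i to node j over j's states.

    prior_i: dict state_i -> possibility degree of node i
    potential_ij: dict (state_i, state_j) -> conditional possibility degree
    incoming: list of messages (dicts state_i -> degree) from i's other neighbors
    """
    msg = {}
    for xj in states_j:
        best = 0.0
        for xi in states_i:
            # min-combination of the local possibility degrees ...
            val = min([prior_i[xi], potential_ij[(xi, xj)]] +
                      [m[xi] for m in incoming])
            # ... followed by max-marginalization over node i's states
            best = max(best, val)
        msg[xj] = best
    return msg

# Tiny two-state example with no other incoming messages
states = [0, 1]
prior = {0: 1.0, 1: 0.4}
pot = {(0, 0): 1.0, (0, 1): 0.3, (1, 0): 0.5, (1, 1): 1.0}
m = update_message(prior, pot, [], states, states)
# m == {0: 1.0, 1: 0.4}
```

In a loopy network these updates are iterated over all edges until the messages stabilize (or an iteration limit is reached), which is the convergence behavior the paper's experiments examine.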
Amen Ajroud, Mohamed Nazih Omri, Salem Benferhat, Habib Youssef