Inho Cho, Soya Park, Sejun Park, Dongsu Han, Jinwoo Shin
Graphical Models (GMs) provide a popular framework for big-data analytics because they often lend themselves to distributed and parallel processing by exploiting graph-based 'local' structures. A GM models correlated random variables, and max-product Belief Propagation (BP) is the most popular heuristic for computing the most-likely assignment in a GM. In recent years, BP has been proven to solve a few classes of combinatorial optimization problems under certain conditions. Motivated by this, we explore the prospect of using BP to solve generic combinatorial optimization problems. The challenge is that, in practice, BP may converge very slowly, and even when it does converge, the BP decision often violates the constraints of the original problem. This paper proposes a generic framework that enables BP-based algorithms to compute an approximate feasible solution for an arbitrary combinatorial optimization task. The main novel ingredients are (a) careful initialization of BP messages, (b) hybrid damping of BP updates, and (c) post-processing using BP beliefs. Using the framework, we develop parallel algorithms for several large-scale combinatorial optimization problems, including maximum weight matching, vertex cover, and independent set. We demonstrate that our framework delivers high approximation ratios, speeds up processing through parallelization, and scales to instances involving billions of variables.
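The three ingredients named in the abstract can be illustrated on the simplest of the listed problems, maximum weight matching on a bipartite graph. The sketch below is not the authors' implementation: it uses the standard simplified max-product message updates for bipartite matching, with zero-initialized messages standing in for (a), a single fixed damping factor standing in for the paper's hybrid damping (b), and a greedy feasibility repair over the final beliefs standing in for the belief-based post-processing (c). The function name `bp_mwm` and all parameter choices are illustrative assumptions.

```python
def bp_mwm(W, damping=0.5, iters=100):
    """Approximate maximum weight matching on an n x n bipartite weight
    matrix W via damped max-product BP. Returns match[i] = j, the column
    matched to row i. Illustrative sketch, not the paper's algorithm."""
    n = len(W)
    # (a) message initialization: zeros are the simplest choice
    r2c = [[0.0] * n for _ in range(n)]  # message from row i to column j
    c2r = [[0.0] * n for _ in range(n)]  # message from column j to row i
    for _ in range(iters):
        nr2c = [[0.0] * n for _ in range(n)]
        nc2r = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                # row i -> col j: weight minus the best competing incoming message
                best = max((c2r[i][k] for k in range(n) if k != j), default=0.0)
                nr2c[i][j] = W[i][j] - best
                # col j -> row i, symmetrically
                best = max((r2c[k][j] for k in range(n) if k != i), default=0.0)
                nc2r[i][j] = W[i][j] - best
        # (b) damped update: mix old and new messages to stabilize dynamics
        for i in range(n):
            for j in range(n):
                r2c[i][j] = (1 - damping) * r2c[i][j] + damping * nr2c[i][j]
                c2r[i][j] = (1 - damping) * c2r[i][j] + damping * nc2r[i][j]
    # (c) post-processing: rank edges by belief, then greedily enforce the
    # matching constraint, which the raw BP decisions may violate
    edges = sorted(((c2r[i][j], i, j) for i in range(n) for j in range(n)),
                   reverse=True)
    match, used_cols = [-1] * n, set()
    for _, i, j in edges:
        if match[i] == -1 and j not in used_cols:
            match[i] = j
            used_cols.add(j)
    return match
```

For example, `bp_mwm([[3, 1], [1, 2]])` recovers the optimal matching `[0, 1]` (row 0 to column 0, row 1 to column 1). The greedy repair step is what guarantees a feasible output even when BP has not converged.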