JOURNAL ARTICLE

Accelerated Message Passing for Entropy-Regularized MAP Inference

Jonathan Lee, Aldo Pacchiano, Peter L. Bartlett, Michael I. Jordan

Year: 2020 · Journal: arXiv (Cornell University) · Vol: 1 · Pages: 5736-5746 · Publisher: Cornell University

Abstract

Maximum a posteriori (MAP) inference in discrete-valued Markov random fields is a fundamental problem in machine learning that involves identifying the most likely configuration of random variables given a distribution. Due to the difficulty of this combinatorial problem, linear programming (LP) relaxations are commonly used to derive specialized message passing algorithms that are often interpreted as coordinate descent on the dual LP. To achieve more desirable computational properties, a number of methods regularize the LP with an entropy term, leading to a class of smooth message passing algorithms with convergence guarantees. In this paper, we present randomized methods for accelerating these algorithms by leveraging techniques that underlie classical accelerated gradient methods. The proposed algorithms incorporate the familiar steps of standard smooth message passing algorithms, which can be viewed as coordinate minimization steps. We show that these accelerated variants achieve faster rates for finding $ε$-optimal points of the unregularized problem, and, when the LP is tight, we prove that the proposed algorithms recover the true MAP solution in fewer iterations than standard message passing algorithms.
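The entropy term described in the abstract smooths the combinatorial max in the MAP objective. A minimal sketch of this idea, under the standard assumption that entropy regularization turns the hard maximum over configurations into a log-sum-exp "soft max" with temperature $1/\eta$ (the function `smoothed_max` below is illustrative, not the paper's algorithm):

```python
import numpy as np

def smoothed_max(theta, eta):
    """Entropy-smoothed maximum via a numerically stable log-sum-exp.

    For scores theta and regularization strength 1/eta, returns
        f_eta(theta) = (1/eta) * log(sum_x exp(eta * theta_x)),
    which satisfies  max(theta) <= f_eta(theta) <= max(theta) + log(n)/eta,
    so the approximation error vanishes as eta grows.
    """
    theta = np.asarray(theta, dtype=float)
    m = theta.max()  # subtract the max before exponentiating for stability
    return m + np.log(np.exp(eta * (theta - m)).sum()) / eta

# Hypothetical scores over four configurations; the true MAP value is 3.5.
theta = np.array([1.0, 2.0, 3.5, 0.5])
for eta in (1.0, 10.0, 100.0):
    val = smoothed_max(theta, eta)
    # The gap to the hard max shrinks at rate log(n)/eta.
    assert theta.max() <= val <= theta.max() + np.log(len(theta)) / eta
```

This is why smoothing trades a small, controllable bias of order $\log(n)/\eta$ for a differentiable objective on which coordinate-minimization (message passing) steps and acceleration techniques apply.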

Keywords:
Message passing, Coordinate descent, Computer science, Approximate inference, Maximum a posteriori estimation, Inference, Markov chain, Belief propagation, Markov decision process, Markov random field, Entropy (arrow of time), Linear programming, Algorithm, Principle of maximum entropy, Gradient descent, Mathematical optimization, Markov process, Mathematics, Decoding methods, Artificial intelligence, Machine learning, Artificial neural network

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 0

Topics

Machine Learning and Algorithms (Physical Sciences → Computer Science → Artificial Intelligence)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Entropy Message Passing

Velimir M. Ilić, Miomir S. Stanković, Branimir T. Todorović

Journal: IEEE Transactions on Information Theory · Year: 2010 · Vol: 57 (1) · Pages: 375-380
JOURNAL ARTICLE

Gene-network inference by message passing

A. Braunstein, A. Pagnani, M. Weigt, R. Zecchina

Journal: Journal of Physics: Conference Series · Year: 2008 · Vol: 95 · Pages: 012016