Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

5-2016

Abstract

Collective graphical models (CGMs) provide a framework for reasoning about a population of independent and identically distributed individuals when only noisy, aggregate observations are available. Previous approaches to inference in CGMs operate on a junction-tree representation, which severely limits their scalability. To remedy this, we show how the Bethe entropy approximation arises naturally for the inference problem in CGMs. We reformulate the resulting optimization problem as a difference-of-convex-functions program that can capture different types of CGM noise models. Using the concave-convex procedure, we then develop a scalable message-passing algorithm. Empirically, our approach is highly scalable and accurate for large graphs and more than an order of magnitude faster than a generic optimization solver; unlike the previous message-passing approach NLBP, which fails to converge on several loopy graphs, it is guaranteed to converge.
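For orientation, the concave-convex procedure (CCP) the abstract relies on follows a simple generic template: write the objective as a difference of convex functions f(x) = u(x) - v(x), linearize the concave part -v at the current iterate, and solve the resulting convex subproblem. The Python sketch below is a toy, hypothetical illustration of that template only; the names (ccp, grad_v, solve_convex) and the example objective are ours, not the paper's CGM message-passing algorithm.

# Hypothetical sketch of the generic CCP template: minimize f(x) = u(x) - v(x)
# with u and v convex by iterating x_{t+1} = argmin_x u(x) - grad_v(x_t) * x.
def ccp(grad_v, solve_convex, x0, iters=100, tol=1e-10):
    x = x0
    for _ in range(iters):
        x_new = solve_convex(grad_v(x))  # convex subproblem after linearizing v
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

# Toy DC objective: f(x) = x**4 - 2*x**2, with u(x) = x**4 and v(x) = 2*x**2.
# The linearized subproblem min_x x**4 - g*x satisfies 4*x**3 = g at the
# optimum, giving the closed form x = sign(g) * (|g| / 4) ** (1/3).
grad_v = lambda x: 4.0 * x
solve_convex = lambda g: (1 if g >= 0 else -1) * (abs(g) / 4.0) ** (1.0 / 3.0)
print(ccp(grad_v, solve_convex, x0=0.5))  # approaches the local minimum x = 1

Each CCP iterate can only decrease the objective, which is the kind of convergence guarantee the abstract contrasts with NLBP.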

Keywords

Approximate inference, Concave-convex procedure, Difference of convex functions, Entropy approximations, Generic optimization, Inference problem, Message passing algorithm, Optimization problems

Discipline

Artificial Intelligence and Robotics | Computer Sciences | Operations Research, Systems Engineering and Industrial Engineering

Research Areas

Intelligent Systems and Optimization

Publication

Proceedings of Machine Learning Research: 19th International Conference on Artificial Intelligence and Statistics (AISTATS 2016), Cadiz, Spain, May 9-11

Volume

51

First Page

685

Last Page

693

Publisher

JMLR

City or Country

Cambridge, MA

Copyright Owner and License

Authors

Additional URL

http://www.jmlr.org/proceedings/papers/v51/nguyen16b.pdf
