
CVPR 2011 Workshop on Inference in Graphical Models with Structured Potentials 
Introduction
Graphical models are now ubiquitous in numerous computer vision applications, from low-level problems such as multi-class image labeling and stereo reconstruction to high-level problems such as articulated object detection. Many of these problems exhibit specific structure, such as regular neighborhoods or submodular potentials, that can be exploited to reduce worst-case inference complexity. These complexity results are often published in distinct communities (sometimes outside of computer vision); this workshop aims to bring them together under a common theme.
Schedule: June 20
8:50: Introduction
9:05: Pedro Felzenszwalb
Fast Inference with Min-Sum Matrix Product
The MAP inference problem in many graphical models can be solved efficiently using a fast algorithm for computing min-sum products of n-by-n matrices. The class of models in question includes cyclic and skip-chain models that arise in many applications. Although the worst-case complexity of the min-sum product operation is not known to be much better than O(n^3), an O(n^2.5) expected-time algorithm was recently given, subject to some constraints on the input matrices. In this talk I will describe an algorithm that runs in O(n^2 log n) expected time, assuming the entries of the input matrices are independent samples from a uniform distribution. I will also show that two variants of the algorithm are quite fast on inputs that arise in several applications, leading to significant performance gains over previous methods in computer vision and natural language processing. This is joint work with Julian McAuley.
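The min-sum ("tropical") matrix product that the talk speeds up can be written down directly; a minimal sketch of the naive O(n^3) baseline (the operation itself, not the speaker's faster algorithm) is:

```python
def min_sum_product(A, B):
    """Min-sum (tropical) product of two n-by-n matrices:
    C[i][j] = min over k of A[i][k] + B[k][j].
    This is the ordinary matrix product with (+, *) replaced by (min, +)."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Example: shortest two-hop paths when A and B hold edge costs.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = min_sum_product(A, B)  # C[0][0] = min(1+5, 2+7) = 6
```

In graphical-model terms, if A and B hold pairwise potentials along a chain, C gives the min-marginal costs after eliminating the middle variable, which is why faster min-sum products translate directly into faster MAP inference.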
9:40: Richard Zemel
Learning and Inference to Exploit High-Order Potentials
An important challenge is to develop models that can capture a variety of structure in the real world, effectively representing both our prior conceptions of this structure and our objectives for the model. Many such structures are high-order, in that they involve interactions between many variables, and therefore cannot be handled efficiently in standard graphical models. In a number of practical settings, however, high-order interactions can be exploited efficiently by leveraging specific within-factor structure. We explore two applications of this tractability: high-order models and high-order losses. Both rely on a common underlying factor graph formulation, with a form of max-product for inference. This method is simple, modular, and flexible; it is easy to define new factors, and experimental results are good on a range of problems of various sizes. I will discuss a few projects within this framework, focusing on image labeling as an application example: learning to optimize counts potentials; using a suite of high-order potentials for joint object and depth labeling; and learning with high-order losses.
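As one illustration of within-factor structure (a standard textbook construction, not necessarily the speaker's exact formulation), a counts potential couples all n binary variables at once, yet exact MAP under it reduces to sorting the unary costs. A minimal sketch, with hypothetical function and argument names:

```python
def map_with_count_potential(unary, count_cost):
    """Exact MAP for binary variables x_1..x_n minimizing
        sum_i unary[i] * x_i  +  count_cost[k],   where k = sum_i x_i.
    Despite the factor touching all n variables, sorting makes this
    O(n log n): for a fixed count k, the best choice is always the
    k variables with the smallest unary costs."""
    n = len(unary)
    order = sorted(range(n), key=lambda i: unary[i])  # cheapest-to-activate first
    prefix = 0.0
    best_cost, best_k = count_cost[0], 0
    for k in range(1, n + 1):
        prefix += unary[order[k - 1]]
        if prefix + count_cost[k] < best_cost:
            best_cost, best_k = prefix + count_cost[k], k
    x = [0] * n
    for i in order[:best_k]:
        x[i] = 1
    return x, best_cost
```

The same sorting trick extends to computing max-product messages from a cardinality factor, which is what makes such high-order potentials usable inside the factor-graph inference the abstract describes.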
10:15: Coffee break
10:40: Dhruv Batra
Focused Inference in Markov Random Fields with Local Primal-Dual Gaps
Markov Random Fields have become the workhorses of modern machine learning and computer vision. A number of situations require dynamic, adaptive inference algorithms that reorder computation toward the "important" parts of the problem. In this talk I will describe one such measure for identifying important parts of the problem, called the Local Primal-Dual Gap (LPDG). LPDG is based on the complementary slackness conditions of the primal-dual pair of linear programs (LPs) in the LP relaxation of MAP inference. We have found LPDG useful in a number of situations: speeding up message-passing algorithms by reordering message computations (Tarlow et al., ICML '11), speeding up alpha-expansion by reordering label sweeps (Batra & Kohli, CVPR '11), and adaptively tightening the standard LP relaxation by choosing important constraints to add (Batra et al., AISTATS '11). Joint work with Pushmeet Kohli (MSRC), Sebastian Nowozin (MSRC), Daniel Tarlow (UToronto), and Vladimir Kolmogorov (UCL).
11:15: Spotlights
Two-minute spotlights by Chang-Dong Wang, Steve Gu, Stefanie Jegelka, Julian Yarkony, Dongjin Kwon, Kyong Joon Lee, and Danny Tarlow
11:30: Poster session, followed by lunch
1:30: Sebastian Nowozin
From Potentials to Polyhedra: Inference in Structured Models
Many computer vision tasks, such as scene understanding and object recognition, are nowadays routinely addressed with sophisticated statistical models over discrete variables. The most efficient algorithms for MAP inference in these models are based on linear programming relaxations of integer linear programming formulations. This establishes a connection between discrete inference and the study of integral polytopes, that is, the field of polyhedral combinatorics. In this talk I distinguish decomposable and non-decomposable higher-order interactions in discrete models, and describe how most structured potentials can be interpreted as so-called 'extended formulation' integer programs. For probabilistic inference the situation is different, and generally harder.
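For concreteness, the standard local-polytope LP relaxation of pairwise MAP inference that the abstract alludes to (textbook material, not specific to this talk) can be written as:

```latex
\begin{align*}
\min_{\mu \ge 0} \quad & \sum_{i \in V} \sum_{x_i} \theta_i(x_i)\, \mu_i(x_i)
  \;+\; \sum_{(i,j) \in E} \sum_{x_i, x_j} \theta_{ij}(x_i, x_j)\, \mu_{ij}(x_i, x_j) \\
\text{s.t.} \quad & \sum_{x_i} \mu_i(x_i) = 1 \qquad \forall\, i \in V, \\
 & \sum_{x_j} \mu_{ij}(x_i, x_j) = \mu_i(x_i) \qquad \forall\, (i,j) \in E,\ \forall\, x_i.
\end{align*}
```

Integral vertices of this polytope correspond exactly to labelings; tightening the relaxation means adding further valid constraints (e.g. cycle inequalities) that cut off fractional vertices.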
2:05: Nikos Komodakis
Fast Training of Pairwise or Higher-Order MRFs
Markov Random Fields are ubiquitous in computer vision, which is why the associated MAP inference algorithms have attracted a significant amount of research in recent years. Beyond MAP estimation, however, learning the parameters of an MRF plays an equally important role, if not a more important one, in successfully applying the resulting models to computer vision problems. In this talk I will present a very efficient learning algorithm for this task. The proposed method is general and flexible, and can easily be applied to both pairwise and higher-order MRF models.
2:40: Nebojsa Jojic
Accepted Papers
Message-Passing for the Traveling Salesman Problem
Chang-Dong Wang, Jian-Huang Lai, Wei-Shi Zheng
Extended Pairwise Potentials
Steve Gu, Ying Zheng, Carlo Tomasi
Multi-label Cooperative Cuts
Stefanie Jegelka, Jeff Bilmes
Planar Decompositions and Cycle Constraints
Julian Yarkony, Ragib Morshed, Alexander T. Ihler, Charless C. Fowlkes
Non-rigid Image Registration Using a Higher-Order MRF Model with Dense Local Descriptors
Dongjin Kwon, Kyong Joon Lee, Il Dong Yun, Sang Uk Lee
Inference with l0-norm-based Sparsity Prior on Discrete Frameworks
Kyong Joon Lee, Dongjin Kwon, Il Dong Yun, Sang Uk Lee
Big and Tall: Large Margin Learning with High Order Losses
Daniel Tarlow, Richard Zemel
Resources
Graph cuts
Y. Boykov, V. Kolmogorov
Belief propagation for early vision
P.F. Felzenszwalb, D.P. Huttenlocher
FastPD MRF optimization
N. Komodakis, G. Tziritas, N. Paragios
Learning for high-level vision
S. Nowozin
Message-passing for LP relaxations
D. Sontag
Structured potentials for dual decomposition-based inference
D. Sontag, A. Globerson, T. Jaakkola
Inference in graphical models (Matlab interfaces)
S. Bagon
Workshop References
Beyond trees: MRF inference via outer-planar decomposition
D. Batra, A. C. Gallagher, D. Parikh, T. Chen
CVPR 2010
Fast approximate energy minimization via graph cuts
Y. Boykov, O. Veksler, R. Zabih
PAMI 2001
Residual belief propagation: Informed scheduling for asynchronous message passing
G. Elidan, I. McGraw, D. Koller
UAI 2006
Efficient belief propagation for early vision
P. F. Felzenszwalb, D. P. Huttenlocher
IJCV 2006
Fast inference with min-sum matrix product
P. Felzenszwalb, J. McAuley
Technical Report, University of Chicago, 2010
Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images
S. Geman, D. Geman
PAMI 1984
Exact optimization for Markov Random Fields with convex priors
H. Ishikawa
PAMI 2003
Counting belief propagation
K. Kersting, B. Ahmadi, S. Natarajan
UAI 2009
Inference in Bayesian networks using nested junction trees
U. Kjærulff
Proceedings of the NATO Advanced Study Institute on Learning in Graphical Models, 1998
What energy functions can be minimized via graph cuts?
V. Kolmogorov, R. Zabih
ECCV 2002
Exploiting data-independence for fast belief-propagation
J. J. McAuley, T. S. Caetano
ICML 2010
A differential semantics for jointree algorithms
J. D. Park, A. Darwiche
NIPS 2003
Fast generalized belief propagation for MAP estimation on 2D and 3D grid-like Markov Random Fields
K. Petersen, J. Fehr, H. Burkhardt
DAGM 2008
Syntactic analysis of two-dimensional visual signals in the presence of noise
M. I. Shlezinger
Kibernetika 1976
Graph cut based optimization for MRFs with truncated convex priors
O. Veksler
CVPR 2007
Questions and comments to julian.mcauley@gmail.com. Last edited 26 June 2011.