Workshop on Discrete Graphical Models, November 13-14, 2014

Location

Heidelberg Collaboratory for Image Processing (Speyerer Str. 6, 2nd floor)

Programme

13.11.2014

12:00 - 13:00   Lunch (by invitation only)
13:00 - 13:15   Introduction
13:15 - 14:00   Tamir Hazan: Probabilistic inference by randomly perturbing max-solvers [slides]
14:00 - 14:45   Bogdan Savchynskyy: Global MAP-optimality by shrinking the combinatorial search area with convex relaxations [slides]
14:45 - 15:15   Coffee Break
15:15 - 16:00   Tomas Werner: How Hard Is Solving the LP Relaxation of the Max-Sum Labeling Problem? [slides]
16:00 - 16:45   Thomas Schiex: Global decomposable cost functions for Cost Function Networks (CFN), aka MAP-MRF [slides]
16:45 - 17:00   Concluding discussions
19:00 - 22:00   Dinner (by invitation only)

14.11.2014

 9:00 -  9:45   Vladimir Kolmogorov: Extensions of submodularity and their applications in computer vision [slides]
 9:45 - 10:30   Paul Swoboda: Persistency by iterative pruning
10:30 - 11:00   Coffee Break
11:00 - 11:45   Simon Degivry: Multi-paradigm evaluation of exact solvers in graphical model discrete optimisation [slides]
11:45 - 12:00   Concluding discussions
12:00 - 13:00   Lunch (by invitation only)
Abstracts of the Talks

Tamir Hazan

Probabilistic inference by randomly perturbing max-solvers

Modern inference problems can be increasingly understood in terms of discrete structures, such as arrangements of objects in computer vision or molecular structures in computational biology. A fully probabilistic treatment considers all possible alternative assignments, which requires estimating exponentially many structures together with their respective weights. These computational difficulties are circumvented by a variety of optimization techniques that provide max-solvers to recover the most likely structure. In this talk I will present a new approach to probabilistic inference that is based on randomly perturbing the scoring function of max-solvers, thus measuring the stability of the prediction under random perturbations.
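One well-known instance of this perturb-and-max idea (shown here for context, and not necessarily the construction used in the talk) is the Gumbel-max trick: adding i.i.d. Gumbel noise to the log-scores of a discrete distribution and taking the argmax yields an exact sample from that distribution. A minimal sketch using only the standard library:

```python
import math
import random
from collections import Counter

def gumbel() -> float:
    """Draw a standard Gumbel variate by inverse-transform sampling."""
    return -math.log(-math.log(random.random()))

def perturb_and_max(log_scores):
    """Return the argmax of the randomly perturbed scores.

    With i.i.d. Gumbel perturbations this is an exact sample from the
    distribution p(i) proportional to exp(log_scores[i]).
    """
    return max(range(len(log_scores)), key=lambda i: log_scores[i] + gumbel())

if __name__ == "__main__":
    random.seed(0)
    log_scores = [math.log(0.5), math.log(0.3), math.log(0.2)]
    counts = Counter(perturb_and_max(log_scores) for _ in range(100_000))
    print({i: counts[i] / 100_000 for i in range(3)})  # approx. 0.5 / 0.3 / 0.2
```

The point of the perturb-and-MAP line of work is that the same argmax call can be delegated to an efficient combinatorial max-solver when the structures are exponential in number.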
   

Bogdan Savchynskyy

Global MAP-optimality by shrinking the combinatorial search area with convex relaxations

We consider energy minimization for undirected graphical models, also known as the MAP-inference problem for Markov random fields. Although combinatorial methods, which return a provably optimal integral solution of the problem, have made significant progress in the past decade, they are still typically unable to cope with large-scale datasets. On the other hand, large-scale datasets are often defined on sparse graphs, and convex relaxation methods, such as linear programming relaxations, then provide good approximations to integral solutions.
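For reference, the standard pairwise form of this problem (not specific to the talk) minimizes, over labelings $x$ of a graph $(V, E)$ with label set $\mathcal{L}$,

```latex
\min_{x \in \mathcal{L}^V} E(x), \qquad
E(x) \;=\; \sum_{u \in V} \theta_u(x_u) \;+\; \sum_{uv \in E} \theta_{uv}(x_u, x_v),
```

where $\theta_u$ and $\theta_{uv}$ are the unary and pairwise potentials; MAP inference corresponds to maximizing $p(x) \propto \exp(-E(x))$.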

We propose a novel method of combining combinatorial and convex programming techniques to obtain a global solution of the initial combinatorial problem. Based on the information obtained from the solution of the convex relaxation, our method confines the application of the combinatorial solver to a small fraction of the initial graphical model, which makes it possible to optimally solve much larger problems.
We demonstrate the efficacy of our approach on a computer vision energy minimization benchmark.

Tomas Werner

How Hard Is Solving the LP Relaxation of the Max-Sum Labeling Problem?
 
We show that solving the standard LP relaxation of the min-sum labeling problem (also known as the MAP-inference problem in graphical models, discrete energy minimization, or valued constraint satisfaction) is not easier than solving an arbitrary linear program. Precisely, every polytope is linear-time representable by a local marginal polytope, and every LP can be reduced in linear time to a linear optimization (allowing infinite costs) over a local marginal polytope. If time allows, we will also discuss a similar result for the attractive model and the relation of our results to fixed points of convergent message-passing algorithms.
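For orientation, the LP relaxation in question is the standard optimization over the local marginal polytope (this formulation is textbook material, not specific to the talk):

```latex
\begin{aligned}
\min_{\mu \ge 0}\; & \sum_{u \in V}\sum_{x_u} \theta_u(x_u)\,\mu_u(x_u)
  + \sum_{uv \in E}\sum_{x_u, x_v} \theta_{uv}(x_u, x_v)\,\mu_{uv}(x_u, x_v) \\
\text{s.t.}\; & \sum_{x_u} \mu_u(x_u) = 1 \quad \forall u \in V, \\
 & \sum_{x_v} \mu_{uv}(x_u, x_v) = \mu_u(x_u) \quad \forall uv \in E,\ \forall x_u,
\end{aligned}
```

where the constraints define the local marginal polytope: the pseudomarginals $\mu$ must be locally consistent, but need not be marginals of any global distribution.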

Thomas Schiex

Global decomposable cost functions for Cost Function Networks (CFN), aka MAP-MRF 
 
In Constraint Satisfaction, one crucial modeling and solving component has been the introduction of global constraints. Despite their unbounded arity, these constraints usually admit polynomial-time generalized arc consistency enforcing algorithms, thanks to their precise semantics. A large number of these constraints are decomposable into a Berge-acyclic network, in which case Generalized Arc Consistency (GAC) on the decomposition is enough to enforce GAC on the global constraint. For the Weighted Constraint Satisfaction Problem, it is also possible to consider Global Cost Functions. There are, however, limits on the level of local consistency that can be enforced efficiently. It is also possible to relax Global Constraints into Global Cost Functions and inherit decomposability from this relaxation. When the decomposition is Berge acyclic, we directly inherit the corresponding GAC result from CSP, provided the level of consistency enforced on the CFN is sufficient. Otherwise, a weaker result can be obtained.

Vladimir Kolmogorov

Extensions of submodularity and their applications in computer vision
 
Submodular functions have played a prominent role in computer vision in the last decade; in many cases they can be minimized efficiently via graph cuts. I will describe a larger class of functions that can be minimized in polynomial time by solving a Basic Linear Programming relaxation of the problem. It was recently shown that, assuming P != NP, this is the only tractable class of Finite-Valued Constraint Satisfaction Problems (VCSPs).

A special case of the new class is the so-called k-submodular functions. In the second part I will describe an application of k-submodular functions to computing partially optimal solutions of certain NP-hard optimization problems. In the case of the Potts energy function there is a close connection to the approach proposed by Kovtun, which also computes a part of an optimal solution. I will show how to improve the complexity of Kovtun's approach from k to O(log k) maxflow computations, where k is the number of labels. The algorithm generalizes the method of Felzenszwalb et al. for tree metrics, and resembles a parametric maxflow algorithm.
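As background (this standard definition is added here for context): a function $f : \{0, 1, \dots, k\}^n \to \mathbb{R}$ is called k-submodular if

```latex
f(s) + f(t) \;\ge\; f(s \sqcap t) + f(s \sqcup t)
\qquad \text{for all } s, t \in \{0, 1, \dots, k\}^n,
```

where, componentwise, $(s \sqcap t)_i = s_i$ if $s_i = t_i$ and $0$ otherwise, while $(s \sqcup t)_i$ equals $\max(s_i, t_i)$ if $\min(s_i, t_i) = 0$, equals $s_i$ if $s_i = t_i$, and equals $0$ if $s_i$ and $t_i$ are distinct and both nonzero. For $k = 1$ this recovers ordinary submodularity.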

Paul Swoboda

Partial Optimality by Pruning for MAP-inference with General Graphical Models
 
We consider the energy minimization problem for undirected graphical models, also known as the MAP-inference problem for Markov random fields, which is NP-hard in general.
We propose a novel polynomial-time algorithm to obtain a part of its optimal non-relaxed integral solution. Our algorithm is initialized with the variables taking integral values in the solution of a convex relaxation of the MAP-inference problem and iteratively prunes those that do not satisfy our criterion for partial optimality. We show that our pruning strategy is in a certain sense theoretically optimal. Empirically, our method also outperforms previous approaches in terms of the number of persistently labelled variables. The method is very general, as it is applicable to models with arbitrary factors of arbitrary order and can employ any solver for the considered relaxed problem.
Our method's runtime is determined by the runtime of the convex relaxation solver for the MAP-inference problem.
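The pruning loop itself can be sketched generically. The relaxation solver and the actual partial-optimality criterion from the talk are abstracted behind a hypothetical `is_persistent` callback, so this is only a schematic skeleton, not the authors' implementation:

```python
def prune_to_persistent(candidates, is_persistent):
    """Iteratively discard candidate variables until the criterion holds everywhere.

    candidates   -- variables that took integral values in the relaxed solution
    is_persistent -- hypothetical callback: does variable v satisfy the
                     partial-optimality criterion w.r.t. the current active set?
    Returns the surviving set of variables, whose labels are certified as part
    of an optimal integral solution (under the criterion's guarantees).
    """
    active = set(candidates)
    changed = True
    while changed:
        changed = False
        for v in sorted(active):  # sorted() copies, so removal while looping is safe
            if not is_persistent(v, active):
                active.discard(v)
                changed = True
    return active
```

A toy usage: with the (made-up) criterion "a variable is persistent only if all its graph neighbours are still active", a chain of failures propagates until a stable set remains.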

Simon Degivry

Multi-paradigm evaluation of exact solvers in graphical model discrete optimisation
 
Graphical models on discrete variables make it possible to model NP-hard optimization problems in which the objective function is factorized into a set of local functions. In the graphical interpretation, each function's scope is represented by a clique. Deterministic graphical models such as the weighted Constraint Satisfaction Problem aim at minimizing the sum of all functions (or constraints, if zero/infinite costs are used). Probabilistic graphical models such as Markov Random Fields aim at maximizing the product of all functions (or constraints, if using zero/one probabilities). A direct transformation exists between the two frameworks, which can also be modeled as weighted MaxSAT or Integer Linear Programming.
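The direct transformation between the two frameworks is essentially a change of scale: taking costs $c = -\log p$ turns maximizing a product of local factors into minimizing a sum of local costs, so both views share their optima. A minimal illustration (toy numbers, not from the talk):

```python
import math
from itertools import product

# Toy model on two binary variables: two unary factors and one pairwise factor.
unary = [{0: 0.7, 1: 0.3}, {0: 0.4, 1: 0.6}]                 # probabilistic view
pairwise = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.5, (1, 1): 0.5}

def prob(x):
    """Product of local factors (MRF / max-product view)."""
    return unary[0][x[0]] * unary[1][x[1]] * pairwise[x]

def cost(x):
    """Sum of -log factors (weighted CSP / min-sum view)."""
    return (-math.log(unary[0][x[0]]) - math.log(unary[1][x[1]])
            - math.log(pairwise[x]))

assignments = list(product((0, 1), repeat=2))
# -log is strictly decreasing, so both views select the same assignment.
assert max(assignments, key=prob) == min(assignments, key=cost) == (0, 0)
```

Infinite costs (forbidden tuples) correspond to zero-probability factors under the same mapping.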
 
In this presentation, I will report on a large comparison of state-of-the-art exact solvers on several deterministic and probabilistic graphical models coming from the Probabilistic Inference Challenge 2011, the Computer Vision and Pattern Recognition OpenGM2 benchmark, the Weighted Partial Max-SAT Evaluation 2013, the Max-CSP 2008 Competition, the MiniZinc Challenge 2012-2013, and a library of Cost Function Networks. The results show that a few general-purpose solvers perform well across multiple paradigms, suggesting the interest of a portfolio approach. By following this approach, we won the UAI 2014 Evaluation for the MAP task.
