Combinatorial Optimization in Machine Learning and Computer Vision

Dr. Bogdan Savchynskyy, Prof. Dr. Carsten Rother, SoSe 2021

This seminar is part of the Master in Physics (specialisation Computational Physics, code "MVJC"), Master of Applied Informatics (code "IS"), and Master of Mathematics (code "MS") programmes, but is also open to students of Scientific Computing and anyone interested.


Machine learning techniques are tightly coupled with optimization methods: many of them become practical only when a suitable optimization tool is available.

In the seminar we will discuss a number of recent articles on combinatorial optimization with applications in computer vision and machine learning.

The topic of this semester is again

Neural Networks meet Combinatorial Optimization

In particular, we will consider methods for

  • training the parameters of combinatorial optimization algorithms with machine learning techniques,
  • loss functions for deep learning based on combinatorial optimization,
and their applications. This semester, the seminar papers were pre-selected based on the talk of Michal Rolínek at the Learning Meets Combinatorial Algorithms (LMCA) NeurIPS 2020 Workshop; see also the corresponding NeurIPS web page with linked videos of the presentations.

General Information

Please register for the seminar in Müsli. The seminar will be held online in MS Teams; all links will be sent by email via Müsli.
Attention: To be able to join the seminar online you must enable MS Teams here. Please do this as soon as possible, as your request may take several days to process.
The first seminar will take place on Wednesday, April 14, at 16:00. Please make sure to participate!

  • Seminar: Wed, 16:00 – 18:00
  • Credits: 2 / 4 / 6 CP depending on course of study

Seminar Repository:

In HeiBox. No registration needed, password will be shared via Müsli.

Papers to Choose from:

See the overview slides for the papers here. See also the presentation video.

For a quick overview of the general problem, these slides may also be helpful. A large collection of slides on the leading black-box differentiation method can be found here.
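To give a flavour of the black-box differentiation method of paper [1] (Vlastelica et al., ICLR 2020), here is a minimal NumPy sketch. The toy solver (picking the cheapest item as a one-hot vector) and the parameter names are illustrative assumptions, not code from the paper; the backward rule, however, follows the paper's idea: perturb the solver's input costs with the incoming gradient, re-run the solver, and return a finite-difference quotient in place of the (zero almost everywhere) true Jacobian.

```python
import numpy as np

def solver(w):
    # Toy combinatorial solver: choose the single cheapest item,
    # i.e. argmin of <w, y> over one-hot indicator vectors y.
    y = np.zeros_like(w)
    y[np.argmin(w)] = 1.0
    return y

def blackbox_backward(w, grad_y, lam=5.0):
    # Gradient surrogate in the spirit of paper [1]:
    # re-solve with costs perturbed by the incoming gradient grad_y
    # (dL/dy) and return the negative difference quotient.
    # lam trades off faithfulness vs. informativeness of the gradient.
    y = solver(w)
    y_lam = solver(w + lam * grad_y)
    return -(y - y_lam) / lam

# Usage: the loss penalises selecting item 1, so the surrogate
# gradient raises w[1] (after a descent step) and lowers w[2],
# steering the solver towards the next-cheapest item.
w = np.array([3.0, 1.0, 2.0])
grad_y = np.array([0.0, 10.0, 0.0])   # dL/dy: item 1 is "wrong"
print(blackbox_backward(w, grad_y))   # → [ 0.  -0.2  0.2]
```

In a deep-learning framework this backward rule would typically be wrapped in a custom autograd function, so that any black-box solver can sit as a layer inside the network.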

[1] M. Vlastelica, A. Paulus, V. Musil, G. Martius, M. Rolínek, Differentiation of Blackbox Combinatorial Solvers, ICLR 2020 (see also slides here).
[2] M. Rolínek, V. Musil, A. Paulus, C. Michaelis, G. Martius, Optimizing Rank-Based Metrics with Blackbox Differentiation, CVPR 2020
[3] M. Rolínek, P. Swoboda, D. Zietlow, V. Musil, A. Paulus, G. Martius, Deep Graph Matching via Blackbox Differentiation, ECCV 2020
[4] A. Paulus, M. Rolínek, V. Musil, B. Amos, G. Martius, Fit the right NP-Hard Problem: End-to-end Learning of Integer Programming Constraints, NeurIPS 2020, LMCA
[5] M. Vlastelica, M. Rolínek, G. Martius, Discrete Planning With Neuro-algorithmic Policies, NeurIPS 2020, LMCA
[6] A. Ferber, B. Wilder, B. Dilkina, M. Tambe, MIPaaL: Mixed Integer Program as a Layer, AAAI 2020
[7] Q. Berthet, M. Blondel, O. Teboul, M. Cuturi, J. Vert, F. Bach, Learning with Differentiable Perturbed Optimizers, NeurIPS 2020
[8] X. Gao, H. Zhang, A. Panahi, T. Arodz, Differentiable Combinatorial Losses through Generalized Gradients of Linear Programs, ArXiv 2020
[9] B. Wilder, B. Dilkina, M. Tambe: Melding the Data-Decisions Pipeline: Decision-Focused Learning for Combinatorial Optimization, AAAI 2019
[10] J. Mandi, E. Demirovic, P. Stuckey, T. Guns, Smart Predict-and-Optimize for Hard Combinatorial Optimization Problems, AAAI 2020
[11] Y. Tan, D. Terekhov, A. Delong, Learning Linear Programs from Optimal Decisions, NeurIPS 2020
[12] T. Gal, Rim Multiparametric Linear Programming, J. Management Science, 1975
[13] R. Freund, Postoptimal Analysis of a Linear Program Under Simultaneous Changes in Matrix Coefficients, 1984
[14] D. De Wolf, Generalized Derivatives of the Optimal Value of a Linear Program with Respect to Matrix Coefficients, European J. of Operational Research, 2000
[15] B. Amos, Z. Kolter, OptNet: Differentiable Optimization as a Layer in Neural Networks, ICML 2017
[16] A. Agrawal, B. Amos, S. Barratt, Differentiable Convex Optimization Layers, NeurIPS 2019
[17] M. Blondel, O. Teboul, Q. Berthet, J. Djolonga, Fast Differentiable Sorting and Ranking, ICML 2020
[18] R. Kleiman, D. Page, AUC_μ: A Performance Metric for Multi-Class Machine Learning Models, NeurIPS, 2019
[19] K. Ataman, W. Street, Y. Zhang, Learning to Rank by Maximizing AUC with Linear Programming, IJCNN, 2006
[20] I. Tsochantaridis, T. Joachims, T. Hofmann, Y. Altun, Large Margin Methods for Structured and Interdependent Output Variables, JMLR, 2005
[21] A. Sadat, M. Ren, A. Pokrovsky, Y. Lin, E. Yumer, R. Urtasun, Jointly Learnable Behavior and Trajectory Planning for Self-Driving Vehicles, IROS 2019
[22] A. Elmachtoub, P. Grigas, Smart “Predict, then Optimize”, Optimization Online 2018
[23] O. El Balghiti, A. Elmachtoub, P. Grigas, A. Tewari, Generalization Bounds in the Predict-then-Optimize Framework, NeurIPS 2019
[24] X. Chen, Y. Zhang, C. Reisinger, L. Song, Understanding Deep Architecture with Reasoning Layer, NeurIPS 2020
[25] P. Wang, P. Donti, B. Wilder, Z. Kolter, SATNet: Bridging Deep Learning and Logical Reasoning Using a Differentiable Satisfiability Solver, ICML, 2019
[26] R. Yonetani, T. Taniai, M. Barekatain, M. Nishimura, A. Kanezaki, Path Planning Using Neural A*, ArXiv, 2020
[27] P. Swoboda, C. Rother, H.A. Alhaija, D. Kainmuller, B. Savchynskyy, A Study of Lagrangean Decompositions and Dual Ascent Solvers for Graph Matching, CVPR, 2016
[28] A. Hornakova, R. Henschel, B. Rosenhahn, P. Swoboda, Lifted Disjoint Paths with Application in Multiple Object Tracking, ICML 2020
[29] J. Thapper, S. Živný, The Complexity of Finite-Valued CSPs, J. ACM, 2016
[30] A. Shekhovtsov, C. Reinbacher, G. Graber and T. Pock: Solving Dense Image Matching in Real-Time using Discrete-Continuous Optimization, CVWW 2016
[31] Knöbelreiter, P., Reinbacher, C., Shekhovtsov, A., and Pock, T. End-to-end training of hybrid CNN-CRF models for stereo. CVPR 2017



