Optimization for Machine Learning, WS2018/19

Dr. Bogdan Savchynskyy, WiSe 2018/19

The lecture on November 21 at 11 am is moved to December 12, 9 am!

Summary

The course presents various existing optimization techniques for important machine learning tasks such as inference and learning in graphical models and neural networks. In particular, it addresses combinatorial algorithms, integer linear programming, scalable convex and non-convex optimization, and convex duality theory. Graphical models and neural networks serve as running examples throughout the course. The goal of the course is to provide a solid background for analyzing existing scalable optimization techniques for machine learning problems and for developing new ones.

Schedule and Information

The lectures and exercises will be given in English.
Venue: Mathematikon B: Berliner Str. 43, 3rd floor, seminar room B128. Simply ring the bell "HCI am IWR" to open the front door.

  • Lecture: Tue, 11:15 – 12:45, Mathematikon B, Berliner Str. 43, SR B128, the first lecture will be given on October 16
  • Lecture: Wed, 11:15 – 12:45, Mathematikon B, Berliner Str. 43, SR B128
  • Exercises: Wed, 09:15 – 10:45, Mathematikon B, Berliner Str. 43, SR B128, the first exercise will take place on October 24

Contact for lectures: Dr. Bogdan Savchynskyy
Contact for exercises: Stefan Haller

The seminar Combinatorial Optimization in Computer Vision and Machine Learning complements this lecture by taking a closer look at recent results and developments. We highly recommend it to all students interested in the topic.

Registration

Please register for the course in Müsli.

Course Material

Lecture notes, slides and exercise sheets are available for download.

Thanks to Bálint Csanády, video recordings exist for some lectures. We will update this page as soon as they become available. Use the username "optml" when prompted.

Exercises

Topics of the next exercise (2018-12-05): discussion of the solutions to exercise sheet 2; introduction to subgradient and coordinate descent for MAP inference (exercise sheet 3)

Submission deadline for exercise sheet 3: 2019-01-13

Table of Contents

I Graphical Models

  • Acyclic Graphical Models. Dynamic Programming
  • Background: Basics of Linear Programs and Their Geometry
  • Inference in Graphical Models as Integer Linear Program
  • Background: Basics of Convex Analysis and Convex Duality
  • Duality of the LP Relaxation of Inference Problem
  • Background: Basics of Convex Optimization
  • Sub-Gradient and Block-Coordinate Ascent for Inference in Graphical Models
  • Lagrangian (Dual) Decomposition
  • Min-Cut/Max-Flow Based Inference
  • LP Relaxation of Inference Problem as st-Min-Cut Problem
  • Summary: Inference Algorithm Selection

II Neural Networks

  • Overview of Architectures
  • Stochastic Gradient for Training Neural Networks

III Joint Learning of Graphical Models and Neural Networks

  • Structured Risk Minimization for Graphical Models
  • CRF+CNN Models: Joint Training of Graphical Models and Neural Networks