Bayesian Modeling and Inference Course

Dr. Melih Kandemir
Tuesdays, 9:30-11:15, Speyererstr 6, Room: G2.09
Spring 2014

Announcement

  • The project topics have been released! Please see below.

Course Description

In this course, Bayesian approaches to pattern recognition and data modeling problems will be covered at an introductory level. The Bayesian modeling framework will be presented from a probability-theoretic point of view, and various techniques for model inference, checking, and selection will be explained.

Schedule (dd/mm)

  • 15/04: Bayesian framework basics [PDF]
  • 22/04: Priors and conjugacy [PDF]
  • 29/04: No lecture
  • 06/05: Performance measures [PDF], Graphical models [PDF] (by C. Bishop)
  • 13/05: Markov Chain Monte Carlo Inference [PDF] [Code]
  • 20/05: Variational Inference (VI) (Bishop Ch. 10)
  • 27/05: Model Selection [PDF]
  • 03/06: Gaussian Processes [PDF]
  • 10/06: Bayesian factor analysis [PDF]
  • 17/06: Dirichlet processes and Bayesian topic models [PDF] (by J. Y. Bak)
  • 24/06: Deep learning with Boltzmann machines
  • 01/07: Nonconjugate inference: Hybrid Monte Carlo and Stochastic Variational Inference
    • Hamiltonian dynamics and MCMC (R. Neal) [PDF]
    • Riemann manifold Langevin and Hamiltonian Monte Carlo (M. Girolami, B. Calderhead) [PDF]
    • VI with stochastic search (J. Paisley et al.) [PDF]
    • Bayesian logistic regression using VI with stochastic search [Code]
  • 08/07: Project deadline

Recommended Text

  • Chris Bishop, Pattern Recognition and Machine Learning, Springer, 2006 [Amazon]
  • David Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012. [Amazon]

Project

Students will be graded on a term project. They will be provided with a list of simple machine learning problems together with benchmark data sets. Each student will choose one of these problems (or propose a comparable alternative), devise a suitable Bayesian model for it, choose an inference method, derive its equations, implement it, and evaluate it on the provided data set. At the end of the semester, each student will hand in a project report containing the devised model, its mathematical details, and the obtained results, together with the related source code.

Project Topics

  • Multioutput Bayesian Classification: The task is to develop a Bayesian binary classifier that is able to predict more than one output given an observation. You can test your algorithm on any of the data sets provided in the Mulan database. A plausible choice would be the Emotions data set.
  • Multiclass Bayesian Logistic Regression: Extend the binary Bayesian logistic regression classifier to multiple classes. You can test your algorithm on the Iris data set and/or the Wine data set.
  • Multiple Instance Bayesian Logistic Regression: Extend the binary Bayesian logistic regression classifier to the multiple instance learning setup. You can test your algorithm on the Musk 1 data set.
  • Bayesian Kernel PCA: Kernelize the Bayesian PCA. You can test your algorithm on the Iris data set.
  • Multiple Kernel Relevance Vector Machines (RVMs): Extend the standard RVM to linearly combine multiple kernels. Learning the kernel hyperparameters is not required. You can test your algorithm on the Flowers 17 data set.
  • Dirichlet process mixture of Bayesian logistic regressors: Develop an infinite mixture of Bayesian linear classifiers using Dirichlet process priors. You can test your algorithm on the Arrhythmia data set or the ISOLET data set.
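Several of the topics above build on binary Bayesian logistic regression with MCMC or variational inference. As a starting point, here is a minimal sketch of random-walk Metropolis sampling for the weights of a Bayesian logistic regression model, under a standard normal prior; the synthetic data stands in for whichever benchmark set you choose, and the step size and chain length are illustrative defaults, not tuned values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (placeholder for a real benchmark set).
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
X = np.hstack([X, np.ones((100, 1))])            # append a bias column
y = np.concatenate([np.zeros(50), np.ones(50)])

def log_posterior(w):
    """Log of N(w | 0, I) prior times the Bernoulli likelihood."""
    logits = X @ w
    # Bernoulli log-likelihood, written stably via logaddexp.
    ll = np.sum(y * logits - np.logaddexp(0.0, logits))
    log_prior = -0.5 * w @ w
    return ll + log_prior

# Random-walk Metropolis over the weight vector.
w = np.zeros(3)
lp = log_posterior(w)
samples = []
for t in range(5000):
    w_prop = w + 0.2 * rng.normal(size=3)        # Gaussian proposal
    lp_prop = log_posterior(w_prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # accept/reject step
        w, lp = w_prop, lp_prop
    if t >= 1000:                                # discard burn-in
        samples.append(w)

# Plug-in prediction with the posterior mean of the weights.
w_mean = np.mean(samples, axis=0)
p = 1.0 / (1.0 + np.exp(-(X @ w_mean)))
acc = np.mean((p > 0.5) == y)
print(f"posterior mean weights: {w_mean}, training accuracy: {acc:.2f}")
```

For the project topics, this skeleton would be extended (e.g. to multiple classes, multiple instances, or a Dirichlet process mixture) and the plug-in prediction replaced by averaging predictions over the posterior samples.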

Office Hours

  • Dr. Melih Kandemir
    Speyererstr. 6, Room: G 2.02
    Wednesdays 14:00-15:00
    name [dot] surname [at] iwr [dot] uni-heidelberg [dot] de