Felix Draxler

PhD student in CS & Math at the Visual Learning Lab and the Image & Pattern Analysis Group
Topic: Theory of Normalizing Flows
Supervisors: Ullrich Köthe and Christoph Schnörr

Email: felix.draxler at iwr.uni-heidelberg.de
Twitter, GitHub, Google Scholar, ORCID

Funded by the Vector Stiftung (TRINN) and STRUCTURES.

Education

  • M.Sc. Physics, Heidelberg University, Germany, 2018
    • Thesis: "Energy Landscape of Neural Networks"
    • Supervisors: Fred Hamprecht, Manfred Salmhofer
    • Scholarship: Max Weber-Programm of the State of Bavaria
  • B.Sc. Physics, LMU Munich, 2015
    • Thesis: "Evolutionary Optimization of a Cooling Sequence for the Generation of Ultracold Atoms"
    • Supervisors: Ulrich Schneider and Immanuel Bloch
    • Semester abroad: INP Grenoble, France (winter 2015/16)
    • Scholarship: Max Weber-Programm of the State of Bavaria

Publications

  • Maximum Likelihood Training of Autoencoders (preprint, 2023)
    Peter Sorrenson*, Felix Draxler*, Christoph Schnörr, Ullrich Köthe
  • Finding Competence Regions in Domain Generalization (TMLR 2023)
    Jens Müller, Stefan T. Radev, Robert Schmier, Felix Draxler, Carsten Rother, Ullrich Köthe
  • On the Convergence Rate of Gaussianization with Random Rotations (ICML 2023)
    Felix Draxler, Lars Kühmichel, Jens Müller, Armand Rousselot, Christoph Schnörr, Ullrich Köthe
  • Whitening Convergence Rate of Coupling-based Normalizing Flows (NeurIPS 2022) Oral, Scholar Award
    Felix Draxler, Christoph Schnörr, Ullrich Köthe
  • Characterizing the Role of a Single Coupling Layer in Affine Normalizing Flows (GCPR 2020) Honorable Mention
    Felix Draxler, Jonathan Schwarz, Christoph Schnörr, Ullrich Köthe
  • Riemannian SOS-Polynomial Normalizing Flows (GCPR 2020)
    Jonathan Schwarz, Felix Draxler, Christoph Schnörr, Ullrich Köthe
  • On the Spectral Bias of Neural Networks (ICML 2019) Oral
    Nasim Rahaman, Aristide Baratin, Devansh Arpit, Felix Draxler, Min Lin, Fred Hamprecht, Yoshua Bengio, Aaron Courville
  • Software library FrEIA (2018-2023)
    Package for easily building invertible neural networks with PyTorch; see the usage sketch after this list
    Lynton Ardizzone, Till Bungert, Felix Draxler, Ullrich Köthe, Jakob Kruse, Robert Schmier, Peter Sorrenson
  • Essentially No Barriers in Neural Network Energy Landscape (ICML 2018) Oral
    Felix Draxler, Kambis Veschgini, Manfred Salmhofer, Fred Hamprecht
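
A minimal usage sketch of FrEIA, following its documented SequenceINN/AllInOneBlock pattern; the subnet width, number of blocks, and data dimension below are illustrative choices, not fixed by the library:

    import torch
    import torch.nn as nn
    import FrEIA.framework as Ff
    import FrEIA.modules as Fm

    # Subnetwork used inside each coupling block (width 128 is arbitrary).
    def subnet_fc(dims_in, dims_out):
        return nn.Sequential(nn.Linear(dims_in, 128), nn.ReLU(),
                             nn.Linear(128, dims_out))

    # Invertible network on 2-D inputs, stacked from 8 coupling blocks.
    inn = Ff.SequenceINN(2)
    for _ in range(8):
        inn.append(Fm.AllInOneBlock, subnet_constructor=subnet_fc,
                   permute_soft=True)

    x = torch.randn(16, 2)
    z, log_jac_det = inn(x)      # forward pass, with log|det J| for likelihood training
    x_rec, _ = inn(z, rev=True)  # exact inverse recovers x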

Invited Talks

Whitening Convergence Rate of Coupling-based Normalizing Flows

  • Stefano Ermon lab, Stanford
  • Marcus Brubaker lab, York University, Toronto
  • Andrew Gordon Wilson lab, New York University
  • NeurIPS 2022: Oral

Characterizing the Role of a Single Coupling Layer in Affine Normalizing Flows

  • GCPR 2020: Oral
  • Math Machine Learning Seminar, MPI MiS + UCLA, invited by Guido Montúfar

Essentially No Barriers in Neural Network Energy Landscape

  • ICML 2018: Long Oral
  • Aspen Winter School 2019: Theoretical Physics for Machine Learning
  • Smita Krishnaswamy lab, Yale