Stefan Hollands

Jörg Lehnert

Inselstraße 22
04103 Leipzig


Courses in the Summer Semester 2022

Due to the rather broad spectrum of topics within the IMPRS, the curriculum consists of a core curriculum to be attended by all students and a variety of more specialized lectures and courses. The heart of our teaching program is certainly the Ringvorlesung. Each semester the Ringvorlesung focuses on one field and is usually delivered by scientific members of the IMPRS, who introduce different approaches and visions within this field.

Important information

  • To stay informed about changes to this lecture series, subscribe to the lecture mailing list.
  • Due to the pandemic, it is strongly recommended to register before attending one of the lectures. For this registration, we simply use the mailing list for the lecture series. So, if you plan to attend a lecture (series), please subscribe to the mailing list.
  • External guests, please use the main entrance at Inselstr. 22 and go to the 3rd floor (reception; see the map). All other doors are closed.
  • Videos from all lectures: every lecture will be recorded and can be viewed later. For more information, streaming, and downloads, see the media web page at the MPI MiS.

IMPRS Ringvorlesung Part I: "Topological Phases and Anyons"

General information

  • Lecturer: Bernd Rosenow
  • Date and time:
    • Thursday 7.4.: 09.15-10.45
    • Thursday 14.4.: 09.15-10.45
    • Friday 22.4.: 11.30-13.00
    • Thursday 28.4.: 09.15-10.45
  • Room: MPI MiS, Leibniz-Hörsaal and also hybrid (subscribe to the mailing list for more info)
  • Audience: IMPRS students (mandatory in their first year), PhD students, postdocs

IMPRS Ringvorlesung Part II: "From Combinatorics to Partial Differential Equations"

General information

  • Lecturer: Felix Otto
  • Date and time: Thursday 5.5. (Leibniz), Friday 6.5. (G3 10), Thursday 12.5. (Leibniz), Friday 13.5. (G3 10): 09.15-10.45
  • Room: MPI MiS, Leibniz-Hörsaal or room G3 10 and also hybrid (subscribe to the mailing list for more info).
  • Audience: IMPRS students (mandatory in their first year), PhD students, postdocs


The optimal matching of two large point clouds is a combinatorial challenge. Optimality amounts to a matching that is cyclically monotone. Of particular interest is the case when the points are independently sampled from some given distribution, i.e. according to some given law. The matching can then be interpreted as the optimal transportation between two independent empirical measures arising from a given smooth distribution. It amounts to determining their Wasserstein distance, i.e. identifying a coupling that maximizes covariance.
The nature of this optimal matching crucially depends on the dimension d of the ambient space, with d=2 being critical. This is revealed by considering the standard situation of a uniform law, i.e. the matching of two independent copies of the Poisson point process. This setting was introduced by the Hungarian school (Ajtai-Komlós-Tusnády '84).
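As a toy illustration (not part of the lecture abstract), in dimension d=1 the optimal matching for quadratic cost simply pairs the points in sorted order, which makes cyclical monotonicity concrete. The following sketch, with function names of our own choosing, compares the sorted matching against a naive one on two independent uniform samples:

```python
import random

def matching_cost(xs, ys):
    # average squared distance of the paired points
    return sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
n = 1000
# two independent samples of n uniform points on [0, 1]
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

# in d=1 the optimal (cyclically monotone) matching pairs sorted points
optimal = matching_cost(sorted(xs), sorted(ys))
naive = matching_cost(xs, ys)  # match points in arrival order

print(optimal <= naive)  # the monotone matching is never worse
```

For squared cost in one dimension, sorting both samples is provably optimal, so the comparison above always holds; in higher dimensions the matching is a genuine combinatorial problem.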

Recently, the Italian school (Parisi et al. '14, Ambrosio et al. '19) elucidated this observation by connecting it to partial differential equations. While the connection between optimal transportation and the Monge-Ampère equation is classical, the above work draws a connection to its linearization, the (simpler) Poisson equation from electrostatics. This connects the optimal transportation between Poisson point processes to the Gaussian free field, thus clarifying the special role of d=2.
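In compact form, the linearization mentioned above can be sketched as follows (a standard heuristic, not taken from the course notes): the transport map T = id + ∇φ pushing a density μ forward to a density ν satisfies the Monge-Ampère equation, and for nearly uniform densities it linearizes to a Poisson equation.

```latex
% Monge-Amp\`ere equation for the potential \phi of the map T = \mathrm{id} + \nabla\phi:
\det\!\big(\mathrm{Id} + D^2\phi\big)\,\nu\big(x + \nabla\phi(x)\big) = \mu(x).
% Linearizing around \phi \approx 0 for densities \mu \approx 1 \approx \nu gives
\Delta\phi = \mu - \nu .
```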

In this mini-course, I plan to introduce these concepts, which involves some elementary probability theory/statistics and analysis/calculus of variations.

IMPRS Ringvorlesung Part III: "A Variational Formulation of Updating Beliefs and Uncertainty"

General information

  • Lecturer: Sayan Mukherjee
  • Date and time: Thursday 9.6., 16.6., 23.6., 30.6.: 09.15-10.45
  • Room: MPI MiS, Leibniz-Hörsaal and also hybrid (subscribe to the mailing list for more info).
  • Audience: IMPRS students (mandatory in their first year), PhD students, postdocs


Bayesian updating using conditional probabilities is the classical framework for inference under uncertainty. The basic procedure in Bayesian updating is: (1) given a prior over parameters and (2) data coming from a specified generative process (the likelihood), use Bayes' rule to obtain the posterior distribution on the parameters, which quantifies one's uncertainty. This clean framework becomes challenging for infinite-dimensional conditional distributions; such models are called nonparametric. In addition, in settings where the generative process is misspecified or unknown (as happens in inverse problems), or for systems that are deterministic with observation noise (such as dynamical systems), the standard Bayesian framework is neither clear, simple, nor computable.
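Steps (1) and (2) above can be made concrete with the standard conjugate coin-flip example (our own toy, not drawn from the lecture): a Beta prior on the head probability combined with Bernoulli data yields a Beta posterior in closed form.

```python
# Conjugate Bayesian updating: Beta prior + Bernoulli likelihood -> Beta posterior
def update_beta(alpha, beta, flips):
    """Apply Bayes' rule for a Beta(alpha, beta) prior on the head
    probability, given a list of coin flips (1 = heads, 0 = tails)."""
    heads = sum(flips)
    tails = len(flips) - heads
    return alpha + heads, beta + tails

# uniform prior Beta(1, 1), then observe 7 heads and 3 tails
alpha, beta = update_beta(1, 1, [1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
posterior_mean = alpha / (alpha + beta)
print(alpha, beta, posterior_mean)  # Beta(8, 4), mean 2/3
```

Conjugacy makes the update a two-line computation; the difficulties described above arise precisely when no such closed form exists.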

A broader perspective on updating beliefs is to replace Bayes' rule for updating the conditional probability with a variational calculation. There are theoretical and methodological arguments for this more general variational framework that we will explore in this mini-course. We will examine examples in inverse problems, dynamical systems, a procedure called probability kinematics, as well as computational approximations popular in machine learning, to study cases where the variational formulation offers some advantages.
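A minimal numerical sketch of the variational viewpoint (an illustrative toy under our own assumptions, not the lecturer's formulation): on a finite parameter space, the Bayes posterior is exactly the minimizer of the free-energy functional F(q) = E_q[-log likelihood] + KL(q || prior), which we can check directly:

```python
import math

def free_energy(q, prior, loglik):
    """F(q) = E_q[-log lik] + KL(q || prior) on a finite parameter space."""
    return sum(qi * (-ll + math.log(qi / pi))
               for qi, pi, ll in zip(q, prior, loglik) if qi > 0)

# two candidate parameter values with a uniform prior
prior = [0.5, 0.5]
loglik = [math.log(0.8), math.log(0.3)]  # log-likelihood of the observed data

# Bayes posterior computed directly via Bayes' rule
unnorm = [p * math.exp(ll) for p, ll in zip(prior, loglik)]
z = sum(unnorm)
posterior = [u / z for u in unnorm]

# the Bayes posterior attains the minimal free energy among all candidates
for q in ([0.5, 0.5], [0.9, 0.1], [0.2, 0.8]):
    assert free_energy(posterior, prior, loglik) <= free_energy(q, prior, loglik) + 1e-12
```

Replacing Bayes' rule with "minimize F(q) over some family of q" is what opens the door to the generalizations (misspecified likelihoods, computational approximations) mentioned above.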

This mini-course assumes some elementary probability theory/statistics and will touch on many of the ideas developed in Part II.

Other recommended lectures

19.05.2022, 11:10