**Speaker:**

Stefan Hollands

**Coordinator:**

Jörg Lehnert

**Address:**

Inselstraße 22

04103 Leipzig

Germany

**Contact:**

+49(0)341-9959-641

+49 (0)341-9959-658 (IMPRS)

# Courses in the Summer Semester 2022

Due to the rather broad spectrum of topics within the IMPRS, the curriculum combines a core curriculum attended by all students with a variety of more specialized lectures and courses. The heart of our teaching program is the **Ringvorlesung**. Each semester the Ringvorlesung focuses on one field and is usually delivered by scientific members of the IMPRS, who introduce different approaches and visions within that field.

### Important information

**Videos from all lectures**: Every lecture will be recorded and can be viewed later. For more information, streaming, and downloads, see the media page at the MPI MiS.

## IMPRS Ringvorlesung Part I: "Topological Phases and Anyons"

### General information

**Lecturer:** Bernd Rosenow

**Date and time:**

- Thursday 7.4.: 09.15-10.45
- Thursday 14.4.: 09.15-10.45
- Friday 22.4.: 11.30-13.00
- Thursday 28.4.: 09.15-10.45

**Room:** MPI MiS, Leibniz-Hörsaal, and also hybrid.

**Audience:** IMPRS students (mandatory in their first year), PhD students, postdocs

- See these lectures as stream / download

## IMPRS Ringvorlesung Part II: "From Combinatorics to Partial Differential Equations"

### General information

**Lecturer:** Felix Otto

**Date and time:** Thursday 5.5. (Leibniz), Friday 6.5. (G3 10), Thursday 12.5. (Leibniz), Friday 13.5. (G3 10): 9.15-10.45

**Room:** MPI MiS, Leibniz-Hörsaal or room G3 10, and also hybrid.

**Audience:** IMPRS students (mandatory in their first year), PhD students, postdocs

- See these lectures as stream / download

### Abstract

The optimal matching of two large point clouds is a combinatorial challenge. Optimality amounts to a matching that is cyclically monotone. Of particular interest is the case when these points are independently sampled from some given distribution, i.e. according to some given law. The matching can then be interpreted as the optimal transportation between two independent empirical measures arising from a given smooth distribution. It amounts to determining their Wasserstein distance, i.e. identifying a coupling that maximizes covariance.
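As a minimal computational sketch (not taken from the course): the optimal matching of two finite point clouds is an assignment problem, solvable exactly with the Hungarian algorithm, and the average matched squared distance is the squared Wasserstein-2 distance between the two empirical measures.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n, d = 200, 2

# Two independent samples ("point clouds") from the uniform law on [0,1]^2.
X = rng.random((n, d))
Y = rng.random((n, d))

# Cost matrix of squared distances: cost[i, j] = |X_i - Y_j|^2.
cost = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)

# The Hungarian algorithm returns the optimal (cyclically monotone) matching.
row, col = linear_sum_assignment(cost)
opt_cost = cost[row, col].sum()

# Squared Wasserstein-2 distance between the two empirical measures.
W2_squared = opt_cost / n

# Sanity check: any other permutation, e.g. the identity, costs at least as much.
assert opt_cost <= cost[np.arange(n), np.arange(n)].sum()
```

For large clouds one would use approximate solvers (e.g. entropic regularization), but the exact assignment above is enough to experiment with the scaling in *n* and *d*.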

The nature of this optimal matching crucially depends on the dimension *d* of the ambient space, with *d=2* being critical. This is revealed by considering the standard situation of a uniform law, i.e. the matching of two independent copies of the Poisson point process. This setting was introduced by the Hungarian school (Ajtai-Komlós-Tusnády '84).
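The criticality of *d=2* is visible in the rate itself; as a rough reminder (a standard statement, recalled here as a sketch rather than quoted from the lecture): for two clouds of *n* independent uniform points on the unit square,

```latex
\mathbb{E}\,\min_{\pi \in S_n}\sum_{i=1}^{n}\bigl|X_i - Y_{\pi(i)}\bigr|
\;\asymp\; \sqrt{n\log n},
```

i.e. a typical displacement of order \(\sqrt{\log n / n}\). The extra \(\sqrt{\log n}\) compared with the rate \(n^{-1/d}\) valid for \(d\ge 3\) is the signature of the critical dimension.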

Recently, the Italian school (Parisi et al. '14, Ambrosio et al. '19) elucidated this observation by connecting it to partial differential equations. While the connection between optimal transportation and the Monge-Ampère equation is classical, the above work draws a connection to its linearization, the (simpler) Poisson equation from electrostatics. This connects the optimal transportation between Poisson point processes to the Gaussian free field, thus clarifying the special role of *d=2*.
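A sketch of the linearization referred to above, in the simplified setting of transporting a near-uniform density onto the uniform one (my notation, not the lecture's): optimal transport of \(\mu = (1+\varepsilon f)\,dx\) onto \(dx\) is given by \(T = \nabla\varphi\), where the potential solves the Monge-Ampère equation

```latex
\det\bigl(D^2\varphi\bigr) = 1 + \varepsilon f .
```

Writing \(\varphi(x) = \tfrac{|x|^2}{2} + \varepsilon\psi(x)\) and expanding to first order in \(\varepsilon\), using \(\det(I + \varepsilon D^2\psi) = 1 + \varepsilon\,\Delta\psi + O(\varepsilon^2)\), gives the Poisson equation

```latex
\Delta\psi = f ,
```

which is exactly the electrostatic problem whose Gaussian analogue leads to the free field.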

In this mini-course, I plan to introduce these concepts, which involves some elementary probability theory/statistics and analysis/calculus of variations.

All lectures were recorded and are available as stream / download.

## IMPRS Ringvorlesung Part III: "Variational formulations of Bayesian Inference"

### General information

**Lecturer:** Sayan Mukherjee

**Date and time:** Thursday 9.6., 16.6., 23.6., 30.6.: 09.15-10.45

**Room:** MPI MiS, Leibniz-Hörsaal and also hybrid.

**Audience:** IMPRS students (mandatory in their first year), PhD students, postdocs

- See these lectures as stream / download

### Abstract

Bayesian updating using conditional probabilities is the classical framework for inference under uncertainty. The basic procedure in Bayesian updating is: (1) given a prior over parameters and (2) data coming from a specified generative process (the likelihood), use Bayes' rule to obtain the posterior distribution on the parameters, which quantifies one's uncertainty. This clean framework becomes challenging for infinite-dimensional conditional distributions; such models are called nonparametric. In addition, in settings where the generative process is misspecified or unknown (as happens in inverse problems), or for systems that are deterministic with observation noise (such as dynamical systems), the standard Bayesian framework is not clear, simple, or computable.
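The basic procedure is easiest to see in a conjugate toy example (a hypothetical illustration, not from the course): with a Beta prior on a coin's success probability and Bernoulli data, Bayes' rule reduces to updating two counts.

```python
# Prior over the parameter: Beta(alpha, beta); Beta(1, 1) is the uniform prior.
alpha, beta = 1.0, 1.0

# Data from the specified generative process (Bernoulli likelihood).
data = [1, 1, 0, 1, 0, 1, 1, 1]

# Bayes' rule with a conjugate prior: the posterior is
# Beta(alpha + #successes, beta + #failures).
alpha_post = alpha + sum(data)
beta_post = beta + len(data) - sum(data)

# The posterior mean summarizes the updated belief about the parameter.
posterior_mean = alpha_post / (alpha_post + beta_post)
print(alpha_post, beta_post, posterior_mean)  # 7.0 3.0 0.7
```

The difficulties mentioned above arise precisely when no such closed-form update exists: infinite-dimensional parameters, misspecified likelihoods, or deterministic dynamics observed with noise.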

A broader perspective on updating beliefs replaces Bayes' rule for updating the conditional probability with a variational calculation. There are theoretical and methodological arguments for this more general variational framework that we will explore in this mini-course. We will examine examples in inverse problems, dynamical systems, a procedure called probability kinematics, and computational approximations popular in machine learning, to study cases where the variational formulation offers some advantages.
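One classical variational characterization (the Gibbs form; a standard fact, not necessarily the course's formulation): the Bayes posterior is the distribution minimizing expected negative log-likelihood plus Kullback-Leibler divergence to the prior. A hypothetical discrete check:

```python
import math

# Discrete parameter grid with a (hypothetical) uniform prior.
thetas = [0.1 * k for k in range(1, 10)]      # candidate success probabilities
prior = [1.0 / len(thetas)] * len(thetas)
successes, trials = 6, 8                      # observed coin flips

def log_lik(theta):
    return successes * math.log(theta) + (trials - successes) * math.log(1 - theta)

# Variational objective: F(rho) = E_rho[-log L] + KL(rho || prior).
def objective(rho):
    f = 0.0
    for r, p, th in zip(rho, prior, thetas):
        if r > 0:
            f += r * (-log_lik(th)) + r * math.log(r / p)
    return f

# Bayes' rule: posterior proportional to prior * likelihood.
weights = [p * math.exp(log_lik(th)) for p, th in zip(prior, thetas)]
z = sum(weights)
posterior = [w / z for w in weights]

# The Bayes posterior achieves a lower objective than the prior itself,
# illustrating that Bayes' rule solves this variational problem.
assert objective(posterior) <= objective(prior)
```

Replacing the KL term or the loss term in `objective` by other functionals is exactly the kind of generalization the variational framework allows.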

This mini-course assumes some elementary probability theory/statistics and will touch on many of the ideas developed in Part II.

### Lecture notes and papers

- Script for lecture 1 (9.6.2022)
**Papers:**

- N. Aronszajn: Theory of Reproducing Kernels. Transactions of the American Mathematical Society, Vol. 68, No. 3 (May 1950), pp. 337-404
- P. Diaconis and D. Freedman: On Inconsistent Bayes Estimates of Location. The Annals of Statistics, 1986, Vol. 14, No. 1, pp. 68-87
- P. Diaconis and D. Ylvisaker: Conjugate priors for exponential families. The Annals of Statistics, 1979, Vol. 7, No. 2, pp. 269-281

- Script for lecture 2 (16.6.2022)
**Papers:**

- N.G. Trillos and D. Sanz-Alonso: The Bayesian update: variational formulations and gradient flows. arXiv:1705.07382v2, 2018
- J. Backhoff-Veraguas, J. Fontbona, G. Rios, F. Tobar: Bayesian Learning with Wasserstein Barycenters. arXiv:1805.10833v4, 2018/2022
- S. Srivastava, C. Li, D.B. Dunson: Scalable Bayes via Barycenter in Wasserstein Space. Journal of Machine Learning Research 19 (2018) 1-35
- A. L. Bertozzi, B. Hosseini, H. Li, K. Miller, and A. M. Stuart: Posterior consistency of semi-supervised regression on graphs. Inverse Problems, 37 (2021), 105011

- Script for lecture 3 (23.6.2022)
**Papers:**

- S. Tokdar: Asymptotics. Lectures 'STA 941' on "Bayesian Nonparametrics", Duke University, 2018
- L. Su, S. Mukherjee: A Large Deviation Approach to Posterior Consistency in Dynamical Systems. arXiv:2106.06894v1, 2021
- K. McGoff, S. Mukherjee, A. Nobel: Gibbs posterior convergence and the thermodynamic formalism. arXiv:1901.08641v1, 2019
- S. Ghosal: A review of consistency and convergence of posterior distribution. In "Proceedings of Varanashi Symposium in Bayesian Inference", Banaras Hindu University, 1996

- Script for lecture 4 (30.6.2022)
**Papers:**

- Ch. Bunne, L. Meng-Papaxanthos, A. Krause, M. Cuturi: Proximal Optimal Transport Modeling of Population Dynamics. arXiv:2106.06345v4, 2021
- M. Lambert, S. Chewi, F. Bach, S. Bonnabel, P. Rigollet: Variational inference via Wasserstein gradient flows. arXiv:2205.15902v1, 2022