Mean field methods in high-dimensional statistics and nonconvex optimization
Andrea Montanari
Starting in the seventies, physicists introduced a class of random energy functions and corresponding random probability distributions (Gibbs measures) known as mean-field spin glasses. Over the years, it has become increasingly clear that a broad array of canonical models in random combinatorics and (more recently) high-dimensional statistics are in fact examples of mean-field spin glasses, and can be studied using tools developed in that area.
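To fix ideas, the canonical example is the Sherrington-Kirkpatrick model; the display below (illustrative notation, not necessarily that used in the lectures) records its random Hamiltonian and the associated Gibbs measure at inverse temperature beta:
\[
  H_N(\sigma) \;=\; \frac{1}{\sqrt{N}} \sum_{1 \le i < j \le N} g_{ij}\, \sigma_i \sigma_j ,
  \qquad \sigma \in \{-1,+1\}^N, \quad g_{ij} \stackrel{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0,1),
\]
\[
  \mu_{N,\beta}(\sigma) \;=\; \frac{1}{Z_N(\beta)}\, e^{\beta H_N(\sigma)},
  \qquad Z_N(\beta) \;=\; \sum_{\sigma \in \{-1,+1\}^N} e^{\beta H_N(\sigma)}
\]
(sign conventions for the exponent vary; with Gaussian couplings the two choices are equivalent in distribution).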
Crucially, these new application domains have raised a number of interesting questions that were not central from the viewpoint of statistical physics. The lectures will focus on these questions:
(i) Statistical questions: what is the accuracy or uncertainty associated with a given statistical method?
(ii) Computational questions: can we efficiently compute marginals of a Gibbs measure? Can we generate low-energy configurations?
The following is a rough outline of the lectures:
1) High-dimensional statistics. General setting and key questions. The role of sharp asymptotics. Examples and general phenomena.
2) Message passing algorithms and approximate message passing (AMP). Sharp analysis of AMP. (An illustrative AMP sketch follows this outline.)
3) Optimal AMP algorithms. Connections with the Bayes error and with convex optimization.
4) Replica symmetry breaking. The Parisi formula. Computational implications.
5) Optimization algorithms for mean-field spin glasses.
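As a small illustration of item 2, the following sketch runs a generic AMP iteration on a rank-one spiked Wigner (Z2 synchronization) model. The model, the tanh denoiser, and all parameter values are illustrative choices for this announcement, not specifics of the lectures; the two defining ingredients of AMP, a componentwise denoiser and the Onsager correction term, appear in the loop.

# Illustrative AMP sketch (assumptions: spiked Wigner model Y = (lam/n) x x^T + W
# with x in {-1,+1}^n, symmetric Gaussian noise W, and a fixed tanh denoiser).
import numpy as np

def amp_spiked_wigner(Y, n_iter=20, x0=None):
    """Run a generic AMP iteration on a symmetric n x n data matrix Y."""
    n = Y.shape[0]
    rng = np.random.default_rng(0)
    # Random initialization; in practice a spectral initialization is often used.
    x = rng.normal(size=n) if x0 is None else x0
    f_prev = np.zeros(n)                 # f(x^{t-1}), needed for the Onsager term
    for _ in range(n_iter):
        f = np.tanh(x)                   # componentwise denoiser f(x^t)
        b = np.mean(1.0 - f ** 2)        # Onsager coefficient: (1/n) sum_i f'(x_i^t)
        x, f_prev = Y @ f - b * f_prev, f  # AMP update: x^{t+1} = Y f(x^t) - b_t f(x^{t-1})
    return np.tanh(x)                    # estimate of the planted signal, up to sign

if __name__ == "__main__":
    # Synthetic instance of the (assumed) spiked Wigner model.
    n, lam = 2000, 2.0
    rng = np.random.default_rng(1)
    x_true = rng.choice([-1.0, 1.0], size=n)
    W = rng.normal(size=(n, n)) / np.sqrt(n)
    W = (W + W.T) / np.sqrt(2)           # symmetric noise with entries of order 1/sqrt(n)
    Y = (lam / n) * np.outer(x_true, x_true) + W
    x_hat = amp_spiked_wigner(Y)
    overlap = abs(x_hat @ x_true) / (np.linalg.norm(x_hat) * np.linalg.norm(x_true))
    print("overlap with the planted signal:", overlap)

The sharp analysis mentioned in items 2 and 3 (state evolution) tracks exactly such iterations, and the choice of denoiser is what distinguishes generic from optimal AMP.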
This course will be accompanied by exercise sessions.