Dr. Galen Reeves, Duke University
Inference and learning in high dimensions: Everything and nothing is Gaussian
This talk will discuss how the computational and statistical limits of some high-dimensional inference and learning problems can be described explicitly in terms of variational formulas for quantities such as entropy and mean-square error. This line of work builds upon ideas from multiple disciplines, including probability theory, information theory, and statistical physics. These concepts will be illustrated in the following contexts:
1) Dynamics of learning in shallow networks via the hidden manifold model (a sketch of the data model follows the list).
2) Scalable measures of dependence via sliced mutual information (a definition is sketched below).
3) Methods for pairwise observation models arising in covariance estimation, clustering, and community detection (a canonical example is sketched below).
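As a concrete illustration of the kind of entropy/mean-square error relationship alluded to above (an assumed example; the talk's specific variational formulas are not reproduced here), consider the I-MMSE identity of Guo, Shamai, and Verdú for the Gaussian channel Y = \sqrt{s}\,X + N with N \sim \mathcal{N}(0, I) independent of X:
\[
\frac{d}{ds}\, I\bigl(X; \sqrt{s}\,X + N\bigr) = \tfrac{1}{2}\,\mathrm{mmse}(s), \qquad \mathrm{mmse}(s) = \mathbb{E}\bigl[\lVert X - \mathbb{E}[X \mid \sqrt{s}\,X + N]\rVert^{2}\bigr],
\]
which ties mutual information (an entropy-type quantity, here in nats) directly to the minimum mean-square error.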
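For item 1, one common formulation of the hidden manifold model (following Goldt, Mézard, Krzakala, and Zdeborová; the exact variant and normalization used in the talk are assumptions here) generates high-dimensional inputs from a low-dimensional latent vector, with labels depending only on the latent coordinates:
\[
x = \sigma(F z) \in \mathbb{R}^{N}, \qquad y = g(z), \qquad z \in \mathbb{R}^{D}, \quad F \in \mathbb{R}^{N \times D}, \quad D \ll N,
\]
where \sigma acts entrywise and F is a fixed feature matrix, so the data concentrate near a D-dimensional nonlinear manifold in \mathbb{R}^{N}.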
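For item 2, sliced mutual information in the sense of Goldfeld and Greenewald (assumed to be the notion intended) averages scalar mutual informations between random one-dimensional projections:
\[
\mathrm{SI}(X; Y) = \mathbb{E}_{\theta, \phi}\bigl[ I(\theta^{\top} X;\; \phi^{\top} Y) \bigr],
\]
with \theta and \phi drawn independently and uniformly from the unit spheres in the ambient dimensions of X and Y; only one-dimensional mutual informations ever need to be estimated, which is what makes the measure scalable.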
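For item 3, a canonical pairwise observation model (an assumed example rather than the talk's exact setting) is the symmetric spiked matrix model, in which each pair of items (i, j) is observed through a noisy rank-one interaction:
\[
Y_{ij} = \sqrt{\tfrac{\lambda}{n}}\, x_{i} x_{j} + W_{ij}, \qquad 1 \le i \le j \le n,
\]
where x \in \mathbb{R}^{n} encodes the signal (for example, community memberships or a low-rank factor), W is symmetric Gaussian noise, and \lambda is a signal-to-noise parameter; variants of this template appear in covariance estimation, clustering, and community detection.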