Random Matrix & Probability Theory Seminar

As the seminar does not meet on a regular weekly basis, the list below reflects the dates of the scheduled talks. Room numbers and times will be announced, and the schedule updated, as details are confirmed.


Date | Name | Title/Abstract



Reza Gheissari (NYU) Dynamics of Critical 2D Potts Models

Abstract: The Potts model is a generalization of the Ising model to $q\geq 3$ states with inverse temperature $\beta$. The Gibbs measure on $\mathbb Z^2$ has a sharp transition between a disordered regime when $\beta<\beta_c(q)$ and an ordered regime when $\beta>\beta_c(q)$. At $\beta=\beta_c(q)$, the phase transition is continuous when $q\leq 4$, while for $q>4$ it is discontinuous and the disordered and ordered phases coexist.

We will discuss recent progress, joint with E. Lubetzky, in analyzing the time to equilibrium (mixing time) of natural Markov chains (e.g., heat-bath/Metropolis) for the 2D Potts model. On an $n \times n$ torus, the mixing time should transition from $O(\log n)$ at high temperatures to $\exp(c_\beta n)$ at low temperatures, via a critical slowdown at $\beta_c(q)$ that is polynomial in $n$ when $q \leq 4$ and exponential in $n$ when $q>4$.
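For readers unfamiliar with the dynamics in question, here is a minimal heat-bath sweep for the $q$-state Potts model on an $n \times n$ torus, run at the critical point $\beta_c(q) = \log(1+\sqrt q)$. This is only an illustrative sketch; the lattice size, number of sweeps, and the choice $q=3$ are arbitrary and not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def heat_bath_sweep(spins, beta, q):
    """One heat-bath sweep: resample each site from its conditional
    distribution given the four torus neighbors."""
    n = spins.shape[0]
    for i in range(n):
        for j in range(n):
            nbrs = [spins[(i + 1) % n, j], spins[(i - 1) % n, j],
                    spins[i, (j + 1) % n], spins[i, (j - 1) % n]]
            # Conditional weight of state k is exp(beta * #agreeing neighbors)
            weights = np.array([np.exp(beta * sum(s == k for s in nbrs))
                                for k in range(q)])
            spins[i, j] = rng.choice(q, p=weights / weights.sum())
    return spins

q, n = 3, 16
beta_c = np.log(1 + np.sqrt(q))   # critical point of the 2D Potts model
spins = rng.integers(q, size=(n, n))
for _ in range(20):
    spins = heat_bath_sweep(spins, beta_c, q)
```

The mixing-time results in the talk concern how many such sweeps are needed before the chain forgets its initial configuration.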




Mustazee Rahman (MIT) On shocks in the TASEP

Abstract: The TASEP particle system runs into traffic jams when the particle density to the left is smaller than the density to the right. Macroscopically, the particle density solves Burgers’ equation, and traffic jams correspond to its shocks. I will describe work with Jeremy Quastel on a specialization of the TASEP shock in which we identify the microscopic fluctuations around the shock by using exact formulas for the correlation functions of TASEP and its KPZ scaling limit. The resulting laws are related to universal laws of random matrix theory.

For the curious, here is a video of the shock forming in Burgers’ equation:
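Shock formation in the inviscid Burgers equation can also be sketched numerically. The following toy solver (a Godunov finite-volume scheme on a periodic domain; grid size and step counts are illustrative choices, not material from the talk) starts from smooth data and integrates past the shock time:

```python
import numpy as np

# Inviscid Burgers equation u_t + (u^2/2)_x = 0, Godunov scheme, periodic BCs.
# Smooth initial data steepens and forms a shock at time t* = 2.

def godunov_flux(ul, ur):
    f = lambda u: 0.5 * u * u
    if ul <= ur:                      # rarefaction configuration
        if ul >= 0:
            return f(ul)
        if ur <= 0:
            return f(ur)
        return 0.0                    # sonic point
    s = 0.5 * (ul + ur)               # shock speed (Rankine-Hugoniot)
    return f(ul) if s > 0 else f(ur)

N = 200
dx = 2 * np.pi / N
x = dx * np.arange(N)
u = 1.0 + 0.5 * np.sin(x)             # smooth initial profile
dt = 0.4 * dx / np.max(np.abs(u))     # CFL time step

for _ in range(400):                  # integrate well past t* = 2
    F = np.array([godunov_flux(u[i], u[(i + 1) % N]) for i in range(N)])
    u = u - dt / dx * (F - np.roll(F, 1))

# The maximum discrete gradient grows from 0.5 to order 1/dx as the shock forms
print(np.max(np.abs(u - np.roll(u, 1))) / dx)
```

Plotting `u` against `x` at successive times shows the profile steepening into the discontinuity that, microscopically, corresponds to the TASEP traffic jam.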




Carlo Lucibello (Microsoft Research NE) The Random Perceptron Problem: thresholds, phase transitions, and geometry

Abstract: The perceptron is the simplest feedforward neural network model, the building block of the deep architectures used in modern machine learning practice. In this talk, I will review some old and new results, mostly focusing on the case of binary weights and random examples. Despite its simplicity, this model exhibits an extremely rich phenomenology: as the number of examples per synapse is increased, the system undergoes different phase transitions, which can be directly linked to solvers’ performance and to information-theoretic bounds. A geometrical analysis of the solution space shows how two different types of solutions, akin to wide and sharp minima, have different generalization capabilities when presented with new examples. Solutions in dense clusters generalize remarkably better, partially closing the gap with Bayes-optimal estimators. Most of the results I will present were first obtained using non-rigorous techniques from spin glass theory, and many of them haven’t been rigorously established yet, although some big steps forward have been taken in recent years.
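The storage version of the problem is easy to state concretely. The following brute-force sketch counts binary weight vectors $w \in \{-1,+1\}^N$ that classify $M$ random $\pm 1$ patterns with positive margin (the sizes $N=15$, $M=10$ are toy choices for exhaustive enumeration, giving $\alpha = M/N \approx 0.67$, below the conjectured capacity $\alpha_c \approx 0.83$):

```python
import itertools
import numpy as np

# Binary perceptron storage problem: count w in {-1,+1}^N with
# w . x_mu > 0 for all M random +-1 patterns x_mu.
rng = np.random.default_rng(1)
N, M = 15, 10
X = rng.choice([-1, 1], size=(M, N))

count = sum(1 for w in itertools.product([-1, 1], repeat=N)
            if np.all(X @ np.array(w) > 0))
print(count, "solutions out of", 2 ** N)
```

Note that if $w$ is a solution then $-w$ cannot be, so at most half of the $2^N$ vectors qualify; the phase transitions in the talk concern how the (exponentially large) solution count and its cluster geometry change as $\alpha$ grows.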



Yash Deshpande (MIT) Phase transitions in estimating low-rank matrices

Abstract: Low-rank perturbations of Wigner matrices have been extensively studied in random matrix theory, and much information about the corresponding spectral phase transition can be gleaned using these tools. In this talk, I will consider an alternative viewpoint based on tools from spin glass theory, and two examples that illustrate how they yield information beyond traditional spectral tools. The first example is the stochastic block model, where we obtain a full information-theoretic picture of estimation. The second example demonstrates how side information alters the spectral threshold. It involves a new phase transition that interpolates between the Wigner and Wishart ensembles.
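The spectral phase transition mentioned above (the BBP-type transition) is easy to see numerically. In the rank-one spiked model $M = W + \lambda vv^{\mathsf T}$ with $W$ a Wigner matrix and $\|v\| = 1$, the top eigenvalue detaches from the bulk edge $2$ exactly when $\lambda > 1$, where it sits near $\lambda + 1/\lambda$. A minimal demonstration (matrix size and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
G = rng.normal(size=(n, n))
W = (G + G.T) / np.sqrt(2 * n)                     # Wigner matrix, bulk ~ [-2, 2]
v = rng.choice([-1.0, 1.0], size=n) / np.sqrt(n)   # unit-norm spike direction

tops = {}
for lam in (0.5, 2.0):
    # Top eigenvalue of the spiked matrix W + lam * v v^T
    tops[lam] = np.linalg.eigvalsh(W + lam * np.outer(v, v))[-1]
    print(lam, tops[lam])
```

Below the threshold ($\lambda = 0.5$) the top eigenvalue stays near the bulk edge $2$; above it ($\lambda = 2$) it sits near $\lambda + 1/\lambda = 2.5$. The talk's point is that spin-glass tools see structure (e.g., information-theoretic thresholds) that this spectral picture misses.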


For information on previous seminars, click here.
