The CMSA Members’ Seminar meets every Friday at 9:30am ET on Zoom. All CMSA postdocs/members are required to attend the weekly CMSA Members’ Seminars, as well as the weekly CMSA Colloquium series. Please email the seminar organizers to obtain a link. This year’s seminar is organized by Tianqi Wu. The schedule will be updated below.
Previous seminars can be found here.
|9/11/2020||Moran Koren||Title: Observational Learning and Inefficiencies in Waitlists |
Abstract: Many scarce resources are allocated through waitlists without monetary transfers. We consider a model, in which objects with heterogeneous qualities are offered to strategic agents through a waitlist in a first-come-first-serve manner. Agents, upon receiving an offer, accept or reject it based on both a private signal about the quality of the object and the decisions of agents ahead of them on the list. This model combines observational learning and dynamic incentives, two features that have been studied separately. We characterize the equilibrium and quantify the inefficiency that arises due to herding and selectivity. We find that objects with intermediate expected quality are discarded while objects with a lower expected quality may be accepted. These findings help in understanding the reasons for the substantial discard rate of transplant organs of various qualities despite the large shortage of organ supply.
|9/18/2020||Michael Douglas||Title: A talk in two parts, on strings and on computers and math|
Abstract: I am dividing my time between two broad topics. The first is string theory, mostly topics in geometry and compactification. I will describe my current work on numerical Ricci-flat metrics, and list many open research questions. The second is computation and artificial intelligence. I will introduce transformer models (BERT, GPT), which have led to breakthroughs in natural language processing, describe their potential for helping us do math, and sketch some related theoretical problems.
|9/25/2020||Cancelled – Math Science Lecture|
|10/2/2020||Cancelled – Math Science Lecture|
|10/9/2020||Wai Tong (Louis) Fan||Title: Stochastic PDE as scaling limits of interacting particle systems|
Abstract: Interacting particle models are often employed to gain understanding of the emergence of macroscopic phenomena from microscopic laws of nature. These individual-based models capture fine details, including randomness and discreteness of individuals, that are not considered in continuum models such as partial differential equations (PDE) and integro-differential equations. The challenge is how to simultaneously retain key information from the microscopic models and the efficiency and robustness of the macroscopic models.
In this talk, I will discuss how this challenge can be overcome by elucidating the probabilistic connections between particle models and PDE. These connections also explain how stochastic partial differential equations (SPDE) arise naturally under a suitable choice of level of detail in modeling complex systems. I will also present some novel scaling limits including SPDE on graphs and coupled SPDE. These SPDE not only interpolate between particle models and PDE, but also quantify the source and the order of magnitude of stochasticity. Scaling limit theorems and new duality formulas are obtained for these SPDE, which connect phenomena across scales and offer insights about the genealogies and the time-asymptotic properties of the underlying population dynamics. Joint work with Rick Durrett.
|10/16/2020||Tianqi Wu||Title: Koebe circle domain conjecture and the Weyl problem in hyperbolic 3-space|
Abstract: In 1908, Paul Koebe conjectured that every open connected set in the plane is conformally diffeomorphic to an open connected set whose boundary components are either round circles or points. The Weyl problem, in the hyperbolic setting, asks for isometric embeddings of surfaces of curvature at least -1 into hyperbolic 3-space. We show that there are close relationships among the Koebe conjecture, the Weyl problem, and the work of Alexandrov and Thurston on convex surfaces. This is joint work with Feng Luo.
|10/23/2020||Changji Xu||Title: Random Walk Among Bernoulli Obstacles|
Abstract: Place an obstacle with probability $1 - p$ independently at each vertex of $\mathbb Z^d$ and consider a simple symmetric random walk that is killed upon hitting one of the obstacles. This is called random walk among Bernoulli obstacles. The most prominent feature of this model is a strong localization effect: the random walk will be localized in a very small region conditional on the event that it survives for a long time. In this talk, we will discuss some recent results about the behaviors of the conditional random walk, in quenched, annealed, and biased settings.
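The survival event conditioned on above can be illustrated with a small Monte Carlo experiment. The sketch below (an illustration only, not from the talk; the parameters `p`, `steps`, and `trials` are arbitrary choices) samples the Bernoulli environment on $\mathbb Z^2$ lazily and estimates the probability that the walk survives a fixed number of steps:

```python
import random

def survival_probability(p=0.7, steps=50, trials=2000, seed=0):
    """Monte Carlo estimate of the probability that a 2-D simple random
    walk survives `steps` steps among Bernoulli obstacles: each site of
    Z^2 other than the origin is open with probability p (i.e. carries
    an obstacle with probability 1 - p)."""
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    survived = 0
    for _ in range(trials):
        open_sites = {(0, 0): True}  # environment sampled lazily as visited
        x, y = 0, 0
        alive = True
        for _ in range(steps):
            dx, dy = rng.choice(moves)
            x, y = x + dx, y + dy
            if (x, y) not in open_sites:
                open_sites[(x, y)] = rng.random() < p
            if not open_sites[(x, y)]:  # walk hits an obstacle and is killed
                alive = False
                break
        survived += alive
    return survived / trials
```

Resampling the environment on every trial corresponds to the annealed setting; fixing one environment across trials would correspond to the quenched setting discussed in the talk.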
|10/30/2020||Michael Simkin||Title: The differential equation method in Banach spaces and the $n$-queens problem|
Abstract: The differential equation method is a powerful tool used to study the evolution of random combinatorial processes. By showing that the process is likely to follow the trajectory of an ODE, one can study the deterministic ODE rather than the random process directly. We extend this method to ODEs in infinite-dimensional Banach spaces.
We apply this tool to the classical $n$-queens problem: Let $Q(n)$ be the number of placements of $n$ non-attacking chess queens on an $n \times n$ board. Consider the following random process: Begin with an empty board. For as long as possible, choose uniformly at random a square with no queens in its row, column, or either diagonal, and place a queen on it. We associate the process with an abstract ODE. By analyzing the ODE we conclude that the process almost succeeds in placing $n$ queens on the board. Furthermore, we can obtain a complete $n$-queens placement by making only a few changes to the board. By counting the number of choices available at each step we conclude that $Q(n) \geq (n/C)^n$, for a constant $C>0$ associated with the ODE. This is optimal up to the value of $C$.
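The random greedy process described in the abstract is easy to state in code. A minimal sketch (the function name and parameters are illustrative, not from the talk):

```python
import random

def random_queens_process(n, seed=None):
    """Random greedy process for the n-queens problem: repeatedly place
    a queen on a uniformly random square that shares no row, column, or
    diagonal with any queen placed so far; stop when no square remains.
    Returns the list of placed queens as (row, col) pairs."""
    rng = random.Random(seed)
    queens = []
    rows, cols = set(), set()
    diag1, diag2 = set(), set()  # diagonals indexed by r - c and r + c
    free = [(r, c) for r in range(n) for c in range(n)]
    while True:
        # keep only squares not attacked by any queen placed so far
        free = [(r, c) for (r, c) in free
                if r not in rows and c not in cols
                and r - c not in diag1 and r + c not in diag2]
        if not free:
            return queens
        r, c = rng.choice(free)
        queens.append((r, c))
        rows.add(r); cols.add(c); diag1.add(r - c); diag2.add(r + c)
```

A typical run places close to, but usually not exactly, $n$ queens before getting stuck, consistent with the result that the process "almost succeeds" and only a few corrections are needed to complete a placement.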
|11/6/2020||Kenji Kawaguchi||Title: Deep learning: theoretical results on optimization and mixup|
Abstract: Deep neural networks have achieved significant empirical success in many fields, including the fields of computer vision, machine learning, and artificial intelligence. Along with its empirical success, deep learning has been theoretically shown to be attractive in terms of its expressive power. However, expressive power alone does not ensure that we can efficiently find, during training, a solution that is good in terms of optimization, robustness, and generalization. In this talk, I will discuss some theoretical results on optimization and the effect of mixup on robustness and generalization.
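For readers unfamiliar with mixup: it is the data-augmentation scheme of Zhang et al. (2018) that trains on convex combinations of random pairs of examples and their labels, with the mixing weight drawn from a Beta distribution. A minimal NumPy sketch (the function name and default `alpha` are illustrative):

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, seed=None):
    """Mixup augmentation: return convex combinations of a batch with a
    random permutation of itself.
    x: (batch, ...) array of inputs; y: (batch, classes) one-hot labels."""
    rng = np.random.default_rng(seed)
    lam = rng.beta(alpha, alpha)          # mixing weight in [0, 1]
    perm = rng.permutation(len(x))        # random pairing of examples
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y + (1 - lam) * y[perm]
    return x_mix, y_mix
```

Because the labels are mixed with the same weight as the inputs, the loss is computed against soft targets, which is the mechanism behind the robustness and generalization effects analyzed in the talk.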
|11/13/2020||Omri Ben-Eliezer||Title: Sampling in an adversarial environment|
Abstract: How many samples does one need to take from a large population in order to truthfully “represent” the population? While this cornerstone question in statistics is very well understood when the population is fixed in advance, many situations in modern data analysis exhibit a very different behavior: the population interacts with and is affected by the sampling process. In such situations, the existing statistical literature does not apply.
We propose a new sequential adversarial model capturing these situations, where future data might depend on previously sampled elements; we then prove uniform laws of large numbers in this adversarial model. The results, techniques, and applications reveal close connections to various areas in mathematics and computer science, including VC theory, discrepancy theory, online learning, streaming algorithms, and computational geometry.
Based on joint works with Noga Alon, Yuval Dagan, Shay Moran, Moni Naor, and Eylon Yogev.
|11/20/2020||Charles Doran||Title: The Calabi-Yau Geometry of Feynman Integrals|
Abstract: Over the past 30 years Calabi-Yau manifolds have proven to be the key geometric structures behind string theory and its variants. In this talk, I will show how the geometry and moduli of Calabi-Yau manifolds provide a new framework for understanding and computing Feynman integrals. An important organizational principle is provided by mirror symmetry, and specifically the DHT mirror correspondence. This is joint work with Andrey Novoseltsev and Pierre Vanhove.