CMSA Q&A Seminar
Speaker: Cliff Taubes, Harvard Mathematics
Topic: What are Z/2 harmonic 1-forms?
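As rough background (my own sketch, not taken from the seminar itself): a Z/2 harmonic 1-form is, loosely, a harmonic 1-form $v$ defined only up to sign on the complement of a singular set $Z \subset M$, i.e. a section of a real line bundle twisted by a Z/2 local system. A minimal sketch of the defining conditions, under that assumption:

```latex
% Sketch (assumed background, not from the talk): v is a 1-form on M \ Z,
% defined up to sign, satisfying the harmonicity conditions
\[
  dv = 0, \qquad d{\star}v = 0 \quad \text{on } M \setminus Z,
\]
% with |v| (which is well defined despite the sign ambiguity) extending
% continuously across Z, vanishing on Z.
```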
https://youtu.be/x7LPDDYZn94

New Technologies in Mathematics Seminar
Speaker: Antonio Sclocchi, EPFL
Title: Hierarchical data structures through the lenses of diffusion models
Abstract: The success of deep learning with high-dimensional data relies on the fact that natural data are highly structured. A key aspect of this structure is hierarchical compositionality, yet quantifying it remains a challenge. In […]
Mathematical Physics and Algebraic Geometry Seminar
Speaker: Chuck Doran, Harvard CMSA
Title: Enumerative geometry and modularity in two-modulus K3-fibered Calabi-Yau threefolds
Abstract: Smooth $M_m$-polarized K3-fibered Calabi-Yau (CY) 3-folds have been classified in terms of the choice of a generalized functional invariant and, in the case $m=1$, a generalized homological invariant. The resulting geometries generally exhibit […]
Quantum Field Theory and Physical Mathematics Seminar
Speaker: Giulia Fardelli, Boston University
Title: Holography and Regge Phases at Large U(1) Charge
Abstract: A single Conformal Field Theory (CFT) can have a rich phase diagram with qualitatively different emergent behaviors in a range of different regimes parameterized by the conserved charges of the theory. In this […]
Math and Machine Learning Program Discussion
Member Seminar
Speaker: Hugo Cui, CMSA
Title: High-dimensional learning of narrow neural networks
Abstract: This talk explores the interplay between neural network architectures and data structure through the lens of high-dimensional asymptotics. We focus on a class of narrow neural networks, namely networks possessing a finite number of hidden units, while operating in high dimensions. In the […]
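As a toy illustration of the setting named in the abstract (the architecture, sizes, and nonlinearity below are my own minimal sketch, not Cui's model): a "narrow" network has a small, fixed number k of hidden units while the input dimension d is large.

```python
import numpy as np

# Hypothetical sketch (not the speaker's model): a narrow two-layer network,
# i.e. finitely many hidden units k while the input dimension d is large.
rng = np.random.default_rng(0)
d, k, n = 1000, 4, 512                      # high dimension d, few units k

W = rng.normal(size=(k, d)) / np.sqrt(d)    # first-layer weights, 1/sqrt(d) scaling
a = rng.normal(size=k)                      # readout weights

def narrow_net(x):
    """f(x) = sum_j a_j * tanh(w_j . x): k neurons acting on a d-dim input."""
    return np.tanh(W @ x) @ a

X = rng.normal(size=(n, d))                 # n Gaussian inputs in R^d
preds = np.tanh(X @ W.T) @ a                # batched evaluation
print(preds.shape)                          # (512,)
```

The point of the sketch is only the scaling regime: k stays finite while d (and the sample size) grow, which is the asymptotic setting the abstract describes.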
Math and Machine Learning Program Discussion
Colloquium
Speaker: Elisenda Grigsby, Boston College
Title: Local complexity measures in modern parameterized function classes for supervised learning
Abstract: The parameter space for any fixed architecture of neural networks serves as a proxy during training for the associated class of functions - but how faithful is this representation? For any fixed feedforward ReLU network architecture, it […]
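One concrete, well-known way the parameterization fails to be faithful (a toy example of my own, not material from the talk): ReLU is positively homogeneous, so rescaling an incoming weight by c > 0 and the outgoing weight by 1/c realizes the same function from different parameters.

```python
import numpy as np

# Hypothetical illustration (not from the talk): the parameter-to-function
# map of a ReLU network is not injective, because ReLU(c*z) = c*ReLU(z)
# for any c > 0.
relu = lambda z: np.maximum(z, 0.0)

def net(x, w, v):
    """One ReLU hidden unit: f(x) = v * ReLU(w * x)."""
    return v * relu(w * x)

x = np.linspace(-2.0, 2.0, 9)
c = 3.0
f1 = net(x, w=1.5, v=-0.7)            # original parameters
f2 = net(x, w=c * 1.5, v=-0.7 / c)    # rescaled parameters, same function
print(np.allclose(f1, f2))           # True
```

Two distinct points of parameter space thus map to one function, which is one reason to ask, as the abstract does, how faithful the parameter space is as a proxy for the function class.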
General Relativity Seminar
Speaker: Oswaldo Vazquez, Northeastern University
Title: Continuation of solutions of Einstein's equations
Abstract: Klainerman-Rodnianski improved the continuation criterion for solutions of Einstein's equations proved by Michael Anderson, using a Kirchhoff-Sobolev-type parametrix and geometric Littlewood-Paley theory. Using their technique but a new parametrix, we prove a continuation condition in the context of […]
Topics in Deep Learning Theory
Speaker: Eli Grigsby
Open Discussion/Tea
Geometry and Quantum Theory Seminar
Speaker: Sunghyuk Park, Harvard CMSA
Title: Skein traces and curve counting
Abstract: Skein modules are vector space-valued invariants of 3-manifolds describing the space of line defects modulo skein relations (determined by a choice of a ribbon category). When the 3-manifold is $S \times I$ for some surface $S$, the skein […]
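For orientation, one standard example of a skein relation (my own addition; the truncated abstract does not specify which ribbon category is used) is the Kauffman bracket relation, in which each crossing $L$ is resolved into its two planar smoothings $L_0$ and $L_\infty$:

```latex
% Kauffman bracket skein relations (standard example, not from the abstract):
\[
  \langle L \rangle \;=\; A\,\langle L_0 \rangle + A^{-1}\,\langle L_\infty \rangle,
  \qquad
  \langle L \sqcup \bigcirc \rangle \;=\; \left(-A^{2} - A^{-2}\right)\langle L \rangle .
\]
```

The skein module of a 3-manifold is then the vector space spanned by links modulo relations of this kind, as in the abstract's description.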