Ongoing

Swampland Program

CMSA 20 Garden Street, Cambridge

During the 2021–2022 academic year, the CMSA will host a program on the so-called “Swampland.” The Swampland program aims to determine which low-energy effective field theories are consistent with nonperturbative quantum gravity considerations. Not […]

CMSA Colloquium 9/15/2021 – 5/25/2022

CMSA 20 Garden Street, Cambridge

During the 2021–22 academic year, the CMSA will be hosting a Colloquium, organized by Du Pei, Changji Xu, and Michael Simkin. It will take place on Wednesdays, 9:30–10:30 am (Boston time). The […]

General Relativity Program

CMSA Room G10, 20 Garden Street, Cambridge

During the Spring 2022 semester, the CMSA hosted a program on General Relativity. This semester-long program included four minicourses, a conference, and a workshop. General Relativity Minicourses: March–May, 2022 General […]

General Relativity Program Minicourses

CMSA Room G10, 20 Garden Street, Cambridge

During the Spring 2022 semester, the CMSA hosted a program on General Relativity. This semester-long program included four minicourses running in March, April, and May; […]

Edge Modes and Gravity

Speaker: Rob Leigh, UIUC
Title: Edge Modes and Gravity
Abstract: In this talk I first review some of the many appearances of localized degrees of freedom — edge modes — in […]

Elliptic chiral homology and chiral index

Abstract: We present an effective quantization theory for chiral deformation of two-dimensional conformal field theories. We explain a connection between the quantum master equation and the chiral homology for vertex operator […]

Renormalization group flow as optimal transport

YouTube Video

Abstract: We show that Polchinski’s equation for exact renormalization group flow is equivalent to the optimal transport gradient flow of a field-theoretic relative entropy. This gives a surprising information-theoretic formulation of the […]

Memorizing Transformers

Virtual

https://youtu.be/5AoOpFFjW28

Speaker: Yuhuai Wu, Stanford and Google
Title: Memorizing Transformers
Abstract: Language models typically need to be trained or fine-tuned in order to acquire new knowledge, which involves updating their weights. […]