  • Symmetry in quantum field theory and quantum gravity 2

    Speaker: Daniel Harlow (MIT) Title: Symmetry in quantum field theory and quantum gravity 2 Abstract: In this talk I will give an overview of semi-recent work with Hirosi Ooguri arguing […]

  • Hydrodynamics and multi-scale order in confluent epithelia

    Abstract: In this talk I will review our ongoing theoretical and experimental efforts toward deciphering the hydrodynamic behavior of confluent epithelia. The ability of epithelial cells to collectively flow lies at […]

  • 12/2/2021 Interdisciplinary Science Seminar

Title: Polyhomogeneous expansions and Z/2-harmonic spinors branching along graphs Abstract: In this talk, we will first reformulate the linearization of the moduli space of Z/2-harmonic spinors branching along a knot. This formula […]

  • Black Holes, 2D Gravity, and Random Matrices

    Member Seminar Speaker: Dan Kapec Title: Black Holes, 2D Gravity, and Random Matrices Abstract: I will discuss old and new connections between black hole physics, 2D quantum gravity, and random matrix theory. […]

  • Extremal Black Hole Corrections from Iyer-Wald

    Abstract: Extremal black holes play a key role in our understanding of various swampland conjectures and in particular the WGC. The mild form of the WGC states that higher-derivative corrections […]

  • Induced subgraphs and tree decompositions

    Speaker: Maria Chudnovsky, Princeton Title: Induced subgraphs and tree decompositions Abstract: Tree decompositions are a powerful tool in both structural graph theory and graph algorithms. Many hard problems become tractable if the […]

  • Defects, link invariants and exact WKB

    Virtual

    Speaker: Fei Yan (Rutgers) Title: Defects, link invariants and exact WKB Abstract: I will describe some of my recent work on defects in supersymmetric field theories. The first part of my talk […]

  • Hierarchical Transformers are More Efficient Language Models

    Virtual

    https://youtu.be/soqWNyrdjkw Speaker: Piotr Nawrot, University of Warsaw Title: Hierarchical Transformers are More Efficient Language Models Abstract: Transformer models yield impressive results on many NLP and sequence modeling tasks. Remarkably, Transformers can […]