BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CMSA - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://cmsa.fas.harvard.edu
X-WR-CALDESC:Events for CMSA
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20230312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20231105T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20240310T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20241103T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20230412T123000
DTEND;TZID=America/New_York:20230412T133000
DTSTAMP:20260426T045722Z
CREATED:20230817T182227Z
LAST-MODIFIED:20240215T103145Z
UID:10001281-1681302600-1681306200@cmsa.fas.harvard.edu
SUMMARY:Unexpected Uses of Neural Networks: Field Theory and Metric Flows
DESCRIPTION:Speaker: James Halverson (Northeastern University)\n \nTitle: Unexpected Uses of Neural Networks: Field Theory and Metric Flows\nAbstract:  We are now quite used to the idea that deep neural networks may be trained in a variety of ways to tackle cutting-edge problems in physics and mathematics\, sometimes leading to rigorous results. In this talk\, however\, I will argue that breakthroughs in deep learning theory are also useful for making progress\, focusing on applications to field theory and metric flows. Specifically\, I will introduce a neural network approach to field theory with a different statistical origin\, that exhibits generalized free field behavior at infinite width and interactions at finite width\, and that allows for the study of symmetries via the study of correlation functions in a different duality frame. Then\, I will review recent progress in approximating Calabi-Yau metrics as neural networks and cast that story into the language of neural tangent kernel theory\, yielding a theoretical understanding of neural network metric flows induced by gradient descent and recovering famous metric flows\, such as Perelman’s formulation of Ricci flow\, in particular limits.
URL:https://cmsa.fas.harvard.edu/event/colloquium12523/
LOCATION:CMSA Room G10\, CMSA\, 20 Garden Street\, Cambridge\, MA\, 02138\, United States
CATEGORIES:Colloquium
ATTACH;FMTTYPE=image/png:https://cmsa.fas.harvard.edu/media/02CMSA-Colloquium-04.12.2023.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20230202T123000
DTEND;TZID=America/New_York:20230202T133000
DTSTAMP:20260426T045722Z
CREATED:20230817T175011Z
LAST-MODIFIED:20240121T174936Z
UID:10001272-1675341000-1675344600@cmsa.fas.harvard.edu
SUMMARY:Neural Optimal Stopping Boundary
DESCRIPTION:Speaker: Max Reppen (Boston University)\nTitle: Neural Optimal Stopping Boundary\nAbstract:  A method based on deep artificial neural networks and empirical risk minimization is developed to calculate the boundary separating the stopping and continuation regions in optimal stopping. The algorithm parameterizes the stopping boundary as the graph of a function and introduces relaxed stopping rules based on fuzzy boundaries to facilitate efficient optimization. Several financial instruments\, some in high dimensions\, are analyzed through this method\, demonstrating its effectiveness. The existence of the stopping boundary is also proved under natural structural assumptions.
URL:https://cmsa.fas.harvard.edu/event/colloquium_2223/
LOCATION:CMSA Room G10\, CMSA\, 20 Garden Street\, Cambridge\, MA\, 02138\, United States
CATEGORIES:Colloquium
ATTACH;FMTTYPE=image/png:https://cmsa.fas.harvard.edu/media/02CMSA-Colloquium-02.02.2023.png
END:VEVENT
END:VCALENDAR