Machine Learning and Mathematical Conjecture

On April 15, 2022, the CMSA will hold a one-day workshop, Machine Learning and Mathematical Conjecture, related to the New Technologies in Mathematics Seminar Series. Organizers: Michael R. Douglas (CMSA/Stony Brook/IAIFI) and Peter Chin (CMSA/BU).

Machine learning has driven many exciting recent scientific advances. It has enabled progress on long-standing challenges such as protein folding, and it has helped mathematicians and mathematical physicists create new conjectures and theorems in knot theory, algebraic geometry, and representation theory.

At this workshop, we will bring together mathematicians, theoretical physicists, and machine learning researchers to review the state of the art in machine learning, discuss how ML results can be used to inspire, test and refine precise conjectures, and identify mathematical questions which may be suitable for this approach.

The workshop will be held in room G10 of the CMSA, located at 20 Garden Street, Cambridge, MA. For a list of lodging options convenient to the Center, please visit our recommended lodgings page.

All non-Harvard affiliated visitors to the CMSA building will need to complete this covid form prior to arrival: https://forms.gle/xKykcNcXq7ciZuvJ8

Speakers:

  • James Halverson, Northeastern University Dept. of Physics and IAIFI
  • Fabian Ruehle, Northeastern University Dept. of Physics and Mathematics and IAIFI
  • Andrew Sutherland, MIT Department of Mathematics

Schedule (Download PDF)

9:30 am – 10:20 am  James Halverson
Title: Machine Learning for Mathematics

Abstract: If deep learning is powerful, but error-prone and stochastic, how are we supposed to obtain rigorous results? I’ll explore this central question in the context of applying machine learning to math, and propose ways forward based on mitigating error and/or stochasticity. Topics include human-in-the-loop conjecture generation and rigorous verification via reinforcement learning and gameplay.

VIDEO
10:30 am – 11:20 am  Andrew Sutherland
Title: Number Theory

Abstract: Much of number theory is concerned with questions related to the distribution of, and correlations between, infinite sets of objects such as prime numbers, number fields, elliptic curves, and modular forms, and various numerical invariants one can associate to these objects. This includes results like the Prime Number Theorem and the Chebotarev Density Theorem, as well as the conjecture of Birch and Swinnerton-Dyer, the Cohen-Lenstra heuristics, the Sato-Tate conjecture, and many aspects of the Langlands Program. I will give a high-level overview of some of these questions that assumes little or no background in number theory, along with an introduction to the L-functions and Modular Forms Database (LMFDB), which catalogs many of these objects and may provide a rich source of data for experiments in machine learning.
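As a rough illustration of how the LMFDB could feed machine-learning experiments, here is a minimal Python sketch that pulls a few elliptic-curve records from its public JSON API (https://www.lmfdb.org/api/). The table name (ec_curvedata), the column names, and the query parameters are assumptions based on the API as documented and should be checked against the live interface before use.

```python
# Minimal sketch: fetch a few elliptic-curve records from the LMFDB API.
# ASSUMPTIONS: table "ec_curvedata" with columns "lmfdb_label", "conductor",
# "rank", and query parameters "_format", "_fields" -- verify against
# https://www.lmfdb.org/api/ before relying on this.
import requests

API = "https://www.lmfdb.org/api/ec_curvedata"

def fetch_curves(max_rows=10):
    """Return a list of (label, conductor, rank) tuples from the LMFDB."""
    params = {
        "_format": "json",                        # request JSON rather than HTML
        "_fields": "lmfdb_label,conductor,rank",  # restrict to the columns we need
    }
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    rows = resp.json().get("data", [])
    return [(r["lmfdb_label"], r["conductor"], r["rank"]) for r in rows[:max_rows]]

if __name__ == "__main__":
    for label, conductor, rank in fetch_curves():
        print(f"{label}: conductor {conductor}, rank {rank}")
```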

SLIDES
11:30 am – 12:20 pm  Fabian Ruehle
Title: Knot Theory

Abstract: This talk covers several aspects of the interplay between machine learning and knot theory. First, a knot representation has to be chosen. This choice needs to be informed by the problem at hand, potential mathematical properties of the representation, and the choice of the machine learning (ML) algorithm.  Next, once an ML algorithm performs well on a given task, one would like to understand what the algorithm does in order to come up with new conjectures. This can be done using feature scoring techniques, which I will introduce.  After that, I will describe generative models. These can generate knots with specific properties or invariants, which can be used to substantiate or invalidate conjectures. Finally, I will discuss how reinforcement learning can be used to obtain provable results.
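To make the feature-scoring step concrete, here is a minimal, generic Python sketch (not the specific method from the talk): a model is fit to predict one quantity from a table of candidate invariants, and permutation importance ranks which inputs the model actually relies on. The invariant names and the data below are placeholders; in practice the columns would be real knot invariants computed with tools such as SnapPy or taken from the KnotInfo database.

```python
# Generic feature-scoring sketch with scikit-learn's permutation importance.
# The "invariants" here are random placeholder data; only the workflow is real.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
feature_names = ["volume", "cusp_re", "cusp_im", "injectivity_radius"]  # placeholders

# Placeholder dataset: the target depends mostly on the first two columns by
# construction, so the scoring step has a clear signal to recover.
X = rng.normal(size=(2000, len(feature_names)))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance: shuffle one column at a time and measure how much the
# test score drops; large drops flag the inputs the model depends on.
scores = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, mean in sorted(zip(feature_names, scores.importances_mean),
                         key=lambda t: -t[1]):
    print(f"{name:>20s}: {mean:.3f}")
```

In applications of this kind, the ranked features suggest which invariants a conjecture should relate, and the candidate relation is then checked and proved by hand.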

SLIDES

VIDEO
12:20 pm – 2:00 pm  Lunch Break
2:00 pm – 3:30 pm Computer demonstrations
3:45 pm – 4:45 pm Discussion