From Word Prediction to Complex Skills: Data Flywheels for Mathematical Reasoning

CMSA Room G10, 20 Garden Street, Cambridge, MA, United States

https://youtu.be/OYOuSAAE7QQ
New Technologies in Mathematics Seminar
Speaker: Anirudh Goyal (University of Montreal)
Title: From Word Prediction to Complex Skills: Data Flywheels for Mathematical Reasoning
Abstract: This talk examines how large language models (LLMs) evolve from simple word prediction to complex skills, with a focus on mathematical problem solving. A major driver of AI products today is the […]

How Far Can Transformers Reason? The Globality Barrier and Inductive Scratchpad

CMSA Room G10, 20 Garden Street, Cambridge, MA, United States

https://youtu.be/C6NDdnSaluU
New Technologies in Mathematics Seminar
Speaker: Aryo Lotfi (EPFL)
Title: How Far Can Transformers Reason? The Globality Barrier and Inductive Scratchpad
Abstract: Can Transformers predict new syllogisms by composing established ones? More generally, what type of targets can be learned by such models from scratch? Recent works show that Transformers can be Turing-complete in terms of […]

Is Behavior Cloning All You Need? Understanding Horizon in Imitation Learning

CMSA Room G10, 20 Garden Street, Cambridge, MA, United States

https://youtu.be/KOgh-FFDlvg
New Technologies in Mathematics Seminar
Speaker: Dylan Foster (Microsoft Research)
Title: Is Behavior Cloning All You Need? Understanding Horizon in Imitation Learning
Abstract: Imitation learning (IL) aims to mimic the behavior of an expert in a sequential decision-making task by learning from demonstrations, and has been widely applied to robotics, autonomous driving, and autoregressive language […]
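
As a rough illustration of the term (not drawn from the talk), behavior cloning reduces imitation learning to supervised learning on expert state-action pairs; the toy dataset and the majority-vote policy below are hypothetical stand-ins.

```python
# A minimal, illustrative behavior-cloning sketch (not from the talk):
# imitation learning treated as supervised learning on expert (state, action) pairs.
from collections import Counter

# Hypothetical expert demonstrations: each entry is (state, action).
demos = [((0, 0), "right"), ((0, 1), "right"), ((1, 1), "up"), ((1, 0), "up")]

def fit_bc_policy(demos):
    # Simplest possible cloner: pick the majority expert action for each observed state.
    by_state = {}
    for state, action in demos:
        by_state.setdefault(state, Counter())[action] += 1
    return {s: counts.most_common(1)[0][0] for s, counts in by_state.items()}

policy = fit_bc_policy(demos)
print(policy[(0, 0)])  # "right" -- mimics the expert on states seen in the demonstrations
```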

Frontier of Formal Theorem Proving with Large Language Models: Insights from the DeepSeek-Prover Series

Virtual

https://youtu.be/qC60ZgsIFvk
New Technologies in Mathematics Seminar
Speaker: Huajian Xin (DeepSeek)
Title: Frontier of Formal Theorem Proving with Large Language Models: Insights from the DeepSeek-Prover Series
Abstract: Recent advances in large language models have markedly influenced mathematical reasoning and automated theorem proving within artificial intelligence. Yet, despite their success in natural language tasks, these models face notable obstacles […]

Thinking Like Transformers – A Practical Session

Virtual

New Technologies in Mathematics Seminar
Speaker: Gail Weiss (EPFL)
Title: Thinking Like Transformers – A Practical Session
Abstract: With the help of the RASP programming language, we can better imagine how transformers, the powerful attention-based sequence processing architecture, solve certain tasks. Some tasks, such as simply repeating or reversing an input sequence, have reasonably straightforward solutions, […]
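
As a rough sketch of the select/aggregate idea the abstract alludes to (not taken from the talk, and written in plain Python rather than RASP), reversing a sequence amounts to having each query position attend to the mirrored key position.

```python
# A minimal, illustrative sketch (not from the talk) of the RASP-style
# select/aggregate idea for reversing a sequence; `select` and `aggregate`
# below are plain-Python stand-ins for the corresponding RASP primitives.

def select(key_pos, query_pos, predicate):
    # Build an attention pattern: entry [q][k] is True when predicate(k, q) holds.
    return [[predicate(k, q) for k in key_pos] for q in query_pos]

def aggregate(pattern, values):
    # RASP's aggregate averages the attended values; here every query position
    # attends to exactly one key, so we simply pick that value out.
    out = []
    for row in pattern:
        selected = [v for v, keep in zip(values, row) if keep]
        out.append(selected[0] if selected else None)
    return out

def reverse(tokens):
    n = len(tokens)
    idx = list(range(n))
    # Query position q attends to key position n - 1 - q.
    pattern = select(idx, idx, lambda k, q: k == n - 1 - q)
    return aggregate(pattern, tokens)

print(reverse(list("hello")))  # ['o', 'l', 'l', 'e', 'h']
```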

Can Transformers Reason Logically? A Study in SAT-Solving

CMSA Room G10, 20 Garden Street, Cambridge, MA, United States

https://youtu.be/o7uac6DuzcQ
New Technologies in Mathematics Seminar
Speaker: Leyan Pan (Georgia Tech)
Title: Can Transformers Reason Logically? A Study in SAT-Solving
Abstract: Transformer-based LLMs have apparently demonstrated capabilities that resemble human reasoning. In our recent work, we investigated the Boolean reasoning abilities of decoder-only Transformers equipped with Chain-of-Thought, establishing that a Transformer model can decide all […]