About Me
I’m currently an Assistant Computational Scientist working in the data science group at the Leadership Computing Facility at Argonne National Laboratory.
I’m generally interested in the application of machine learning to computational problems in physics, particularly within the context of high performance computing. My current research focuses on using deep generative modeling to help build better sampling algorithms in lattice gauge theory. In particular, I’m interested in building gauge equivariant neural network architectures and using inductive priors to incorporate physical symmetries into machine learning models.
I received my PhD in Physics from the University of Iowa in 2019; my thesis was titled Learning Better Physics: A Machine Learning Approach to Lattice Gauge Theory. Prior to this, I completed two bachelor's degrees (Engineering Physics and Applied Mathematics, 2015) at the University of Illinois at Urbana-Champaign. My undergraduate thesis, Energy Storage in Quantum Resonators, was supervised by Professor Alfred Hübler within the Center for Complex Systems Research at UIUC.
Recent Work

M. Zvyagin, A. Brace, K. Hippe, et al., GenSLMs: Genome-scale language models reveal SARS-CoV-2 evolutionary dynamics, Oct 2022

A.S. Kronfeld, et al., Lattice QCD and Particle Physics, 15 Jul 2022

D. Boyda, S. Calí, S. Foreman, et al., Applications of Machine Learning to Lattice Quantum Field Theory, arXiv:2202.05838, Feb 2022

S. Foreman, X.Y. Jin, & J.C. Osborn, LeapFrogLayers: Trainable Framework for Effective Topological Sampling, slides, Lattice, 2021

S. Foreman, L. Jin, X.Y. Jin, A. Tomiya, J.C. Osborn, & T. Izubuchi, HMC with Normalizing Flows, slides, Lattice, 2021

S. Foreman, X.Y. Jin, & J.C. Osborn, Deep Learning Hamiltonian Monte Carlo (+ poster) at SimDL Workshop @ ICLR, 2021

S. Foreman, X.Y. Jin, & J.C. Osborn, Machine Learning and Neural Networks for Field Theory, Snowmass, 2020

S. Foreman, Y. Meurice, J. Giedt, & J. Unmuth-Yockey, Examples of renormalization group transformations for image sets, Physical Review E, 2018

S. Foreman, J. Giedt, Y. Meurice, & J. Unmuth-Yockey, RG-inspired Machine Learning for lattice field theory, arXiv:1710.02079, 2017

S. Foreman, J. Liu, & L. Wortsmann, Large Energy Density in Three-Plate Nanocapacitors due to Coulomb Blockade, J. Appl. Phys., 2018
Invited Talks

Generative Modeling and Efficient Sampling, at PASC23, July 2023

Efficient Sampling for Lattice Gauge Theory, at Deep Fridays @ U. Bologna, April 2023

Large Scale Training, at Introduction to AI-driven Science on Supercomputers: A Student Training Series, November 2022

Hyperparameter Management, at 2022 ALCF Simulation, Data, and Learning Workshop, October 2022

Statistical Learning, at ATPESC 2022, August 2022 📕 accompanying notebook

Scientific Data Science: An Emerging Symbiosis, at Argonne National Laboratory, May 2022

Machine Learning in HEP, at UNC Greensboro, March 2022

Accelerated Sampling Methods for Lattice Gauge Theory, at the BNL-HET & RBRC Joint Workshop “DWQ @ 25”, Dec 2021

Training Topological Samplers for Lattice Gauge Theory, at ML4HEP, on and off the Lattice @ ECT* Trento, Sep 2021

l2hmc-qcd, at the MIT Lattice Group Seminar, 2021

Deep Learning HMC for Improved Gauge Generation, at the Machine Learning Techniques in Lattice QCD Workshop, 2021

Machine Learning for Lattice QCD, at the University of Iowa, 2020

Machine learning inspired analysis of the Ising model transition, at Lattice, 2018

Machine Learning Analysis of Ising Worms, at Brookhaven National Laboratory, 2017