Last Updated: 05/13/2024 @ 11:00:23
I'm:

- currently a computational scientist at Argonne National Laboratory (ALCF).
- generally interested in the application of AI + HPC to computational problems in science,
- usually working on scaling large (language, vision, multi-modal) models across thousands of GPUs.
- If this sounds like something you'd be interested in doing, please feel free to reach out to me.
How I got here
My current research focuses on using deep generative modeling to help build better sampling algorithms in lattice gauge theory. In particular, I'm interested in building gauge equivariant neural network architectures and using inductive priors to incorporate physical symmetries into machine learning models.
I received my PhD in Physics from the University of Iowa in 2019; my thesis was on Learning Better Physics: A Machine Learning Approach to Lattice Gauge Theory. Prior to this, I completed two bachelor's degrees (Engineering Physics and Applied Mathematics, 2015) at the University of Illinois at Urbana-Champaign. My undergraduate dissertation was titled Energy Storage in Quantum Resonators and was supervised by Professor Alfred Hübler within the Center for Complex Systems Research at UIUC.
What I work on
As a member of the AI / ML Group at ALCF, I work on:
Building new parallelism techniques for efficient scaling
Generative modeling (esp. for physical systems)
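The scaling work above ultimately rests on data parallelism: each GPU computes gradients on its own shard of a batch, then all workers average them (an all-reduce) before applying the identical update. A minimal pure-Python sketch of that averaging step — illustrative only, not code from any of the libraries listed here:

```python
def allreduce_mean(per_worker_grads):
    """Elementwise average of each worker's gradient vector.

    This is the collective at the heart of data-parallel training:
    after it runs, every worker applies the same averaged update.
    """
    n = len(per_worker_grads)
    return [sum(components) / n for components in zip(*per_worker_grads)]

# Two hypothetical workers, each with gradients from its own micro-batch:
grads = [[1.0, 2.0], [3.0, 4.0]]
print(allreduce_mean(grads))  # [2.0, 3.0]
```

In practice this reduction is performed across GPUs by collectives such as `torch.distributed.all_reduce` rather than in Python.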
Publications

Intro to HPC Bootcamp: Engaging New Communities Through Energy Justice Projects, Journal of Computational Science, 2024
Thorough Characterization and Analysis of Large Transformer Model Training At-Scale, Proc. ACM Meas. Anal. Comput. Syst. March 2024
MLMC: Machine Learning Monte Carlo for Lattice Gauge Theory, S. Foreman, X.Y. Jin, & J.C. Osborn, Lattice, 2023 (Proceedings), Dec 2023
Protein Generation via Genome-scale Language Models with Bio-physical Scoring, SCโ23, Nov 2023
DeepSpeed4Science Initiative: Enabling Large-Scale Scientific Discovery […], NeurIPS 2023 AI For Science Workshop, Oct 2023
Comprehensive Performance Study of LLMs on Novel AI Accelerators, M. Emani, S. Foreman, et al., IPDPS 2024, Oct 2023
Exploratory Analysis of Climate Data with ClimRR, S. Foreman, Intro to HPC Bootcamp @ NERSC, August 7, 2023
GenSLMs: Genome-scale language models reveal SARS-CoV-2 evolutionary dynamics, Oct 2022
Lattice QCD and Particle Physics, A.S. Kronfeld et al., July 15, 2022
Applications of ML to Lattice QFT, arXiv:2202.05838, D. Boyda, S. Calì, S. Foreman, et al., Feb 2022
LeapFrogLayers: Trainable Framework for Effective Sampling, S. Foreman, X.Y. Jin, J.C. Osborn, Lattice, 2021
HMC with Normalizing Flows, slides, S. Foreman et al., Lattice, 2021
Deep Learning Hamiltonian Monte Carlo (+ poster), S. Foreman, X.Y. Jin, & J.C. Osborn, @ SimDL Workshop @ ICLR, 2021
Machine Learning and Neural Networks for Field Theory, S. Foreman, X.Y. Jin, & J.C. Osborn, SnowMass, 2020
Examples of renormalization group transformations for image sets, S. Foreman et al., Physical Review E, 2018
RG inspired Machine Learning for lattice field theory, S. Foreman et al., arXiv:1710.02079, 2017
Large Energy Density in Three-Plate Nanocapacitors due to Coulomb Blockade, S. Foreman et al., J. Appl. Phys, 2018
Talks

- Creating Small(-ish) LLMs @ LLM Tutorial Workshop (1), (11/2023)
- LLM Lunch Talk @ ALCF Hands On HPC Workshop, (10/2023)
- Generative Modeling and Efficient Sampling @ PASC23, (07/2023)
- Large Scale Training @ AI-4-Science on Supercomputers, (02/2023)
- Statistical Learning @ ATPESC 2022, (08/2022) — accompanying notebook
- Scientific Data Science: An Emerging Symbiosis @ Argonne National Laboratory, (05/2022)
- Machine Learning in HEP @ UNC Greensboro, (03/2022)
- Training Topological Samplers for Lattice Gauge Theory @ ML4HEP @ ECT* Trento, (09/2021)
- l2hmc-qcd @ MIT Lattice Group Seminar, (2021)
- Machine Learning for Lattice QCD @ University of Iowa, (2020)
- Machine Learning Analysis of Ising Worms @ Brookhaven National Laboratory, (2017)
Projects

- ezpz [web] - Distributed training, ezpz.
- Megatron-DeepSpeed - For the largest of large language models.
- wordplay [web] - [nanoGPT] + {datasets, DeepSpeed}
- ai-science-training-series [web] - Lecture series, "AI-4-Science-on-Supercomputers"
- enrich [web] - Python's logging, with Rich
- ambivalent [web] - Clean, simple style for Matplotlib figures.
- l2hmc-qcd [web] - Accelerated samplers for Lattice QCD
- climate-analysis [web] - Climate Analysis project using ClimRR data
Organizer for SC23 Workshop: High Performance Python for Science at Scale (HPPSS), November 2023
Organizer for Machine Learning and Quantum Computing for Earth Sciences at 17th U. S. National Congress on Computational Mechanics, July 2023
| Position | Institution | From | To |
|---|---|---|---|
| Assistant Computational Scientist | ALCF | 2022 | present |
| Postdoc | ALCF | 2019 | 2022 |
| Graduate Researcher | ANL | 2018 | 2019 |
Citation
@online{foreman,
author = {Foreman, Sam},
title = {Sam {Foreman}},
url = {https://samforeman.me},
langid = {en}
}