Sam Foreman’s personal website
Author: Sam Foreman
Published: July 10, 2025
Modified: July 10, 2025

Sam Foreman

👋 Hi, I’m Sam!

import datetime

from rich import print

# Print a colored "Last Updated: YYYY-MM-DD @ HH:MM:SS" stamp using
# rich markup. Single quotes inside the f-strings keep this compatible
# with Python < 3.12, which disallows reusing the outer quote character.
now = datetime.datetime.now()
print(
    " ".join([
        "[#838383]Last Updated[/]:",
        f"[#E599F7]{now.strftime('%Y-%m-%d')}[/]",
        "[#838383]@[/]",
        f"[#00CCFF]{now.strftime('%H:%M:%S')}[/]",
    ])
)
Last Updated: 2025-07-10 @ 12:51:51

You can find a full list of my publications on my Google Scholar profile.

References

Boyda, Denis, Salvatore Calì, Sam Foreman, Lena Funcke, Daniel C. Hackett, Yin Lin, Gert Aarts, et al. 2022. “Applications of Machine Learning to Lattice Quantum Field Theory.” arXiv Preprint arXiv:2202.05838. https://arxiv.org/abs/2202.05838.
Cheng, Scott, Jun-Liang Lin, Murali Emani, Siddhisanket Raskar, Sam Foreman, Zhen Xie, Venkatram Vishwanath, and Mahmut Taylan Kandemir. 2024. “Thorough Characterization and Analysis of Large Transformer Model Training at-Scale.” Proceedings of the ACM on Measurement and Analysis of Computing Systems 8 (1): 1–25.
Deamont, George, and Sam Foreman. 2014. “Superconductivity of In and Sn Samples.”
Dharuman, Gautham, Kyle Hippe, Alexander Brace, Sam Foreman, Väinö Hatanpää, Varuni K. Sastry, Huihuo Zheng, et al. 2024. “MProt-DPO: Breaking the ExaFLOPS Barrier for Multimodal Protein Design Workflows with Direct Preference Optimization.” In SC24: International Conference for High Performance Computing, Networking, Storage and Analysis, 74–86. IEEE Computer Society.
Dharuman, Gautham, Logan Ward, Heng Ma, Priyanka V. Setty, Ozan Gokdemir, Sam Foreman, Murali Emani, et al. 2023. “Protein Generation via Genome-Scale Language Models with Bio-Physical Scoring.” In Proceedings of the SC’23 Workshops of the International Conference on High Performance Computing, Network, Storage, and Analysis, 95–101.
Emani, Murali, Sam Foreman, Varuni Sastry, Zhen Xie, Siddhisanket Raskar, William Arnold, Rajeev Thakur, Venkatram Vishwanath, and Michael E. Papka. 2023. “A Comprehensive Performance Study of Large Language Models on Novel AI Accelerators.” arXiv Preprint arXiv:2310.04607. https://arxiv.org/abs/2310.04607.
Foreman, Sam, Joel Giedt, Yannick Meurice, and Judah Unmuth-Yockey. 2018. “RG-Inspired Machine Learning for Lattice Field Theory.” In EPJ Web of Conferences, 175:11025. EDP Sciences.
Foreman, Sam, Taku Izubuchi, Luchang Jin, Xiao-Yong Jin, James C. Osborn, and Akio Tomiya. 2021. “HMC with Normalizing Flows.” arXiv Preprint arXiv:2112.01586. https://arxiv.org/abs/2112.01586.
Foreman, Sam, Xiao-Yong Jin, and James C. Osborn. 2023. “MLMC: Machine Learning Monte Carlo for Lattice Gauge Theory.” In 40th International Symposium on Lattice Field Theory (Lattice 2023), Batavia, IL, July 31 – August 4, 2023.
Foreman, Sam, Xiao-Yong Jin, and James C. Osborn. 2020. “Machine Learning and Neural Networks for Field Theory.”
Foreman, Samuel Alfred. 2019. “Learning Better Physics: A Machine Learning Approach to Lattice Gauge Theory.” PhD thesis, University of Iowa.
Foreman, Samuel, Joel Giedt, Yannick Meurice, and Judah Unmuth-Yockey. 2018. “Examples of Renormalization Group Transformations for Image Sets.” Physical Review E 98 (5): 052129.
Foreman, Sam, Xiao-Yong Jin, and James C. Osborn. 2022. “LeapfrogLayers: A Trainable Framework for Effective Topological Sampling.” In The 38th International Symposium on Lattice Field Theory, 508. https://doi.org/10.22323/1.396.0508.
Gokdemir, Ozan, Carlo Siebenschuh, Alexander Brace, Azton Wells, Brian Hsu, Kyle Hippe, Priyanka V. Setty, et al. 2025. “HiPerRAG: High-Performance Retrieval Augmented Generation for Scientific Insights.” https://arxiv.org/abs/2505.04846.
Hubler, A., S. Foreman, J. Liu, and L. Wortsmann. 2018. “Large Energy Density in Three-Plate Nanocapacitors Due to Coulomb Blockade.” Journal of Applied Physics 123 (10).
Kronfeld, Andreas S., Tanmoy Bhattacharya, Thomas Blum, Norman H. Christ, Carleton DeTar, William Detmold, Robert Edwards, et al. 2022. “Lattice QCD and Particle Physics.” arXiv Preprint arXiv:2207.07641. https://arxiv.org/abs/2207.07641.
Liu, Jiaqi, Alfred W Hubler, Samuel Alfred Foreman, and Katharina Ott. 2017. “Energy Storage in Quantum Resonators.”
Parete-Koon, Suzanne, Michael Sandoval, Kellen Leland, Subil Abraham, Mary Ann Leung, Rebecca Hartman-Baker, Paige Kinsley, et al. 2024. “Intro to HPC Bootcamp: Engaging New Communities Through Energy Justice Projects.” Journal of Computational Science Education 15 (1).
Shanahan, Phiala, Kazuhiro Terao, and Daniel Whiteson. 2022. “Snowmass 2021 Computational Frontier CompF03 Topical Group Report: Machine Learning.” arXiv Preprint arXiv:2209.07559. https://arxiv.org/abs/2209.07559.
Song, Shuaiwen Leon, Bonnie Kruft, Minjia Zhang, Conglong Li, Shiyang Chen, Chengming Zhang, Masahiro Tanaka, et al. 2023. “DeepSpeed4Science Initiative: Enabling Large-Scale Scientific Discovery Through Sophisticated AI System Technologies.” arXiv Preprint arXiv:2310.04610. https://arxiv.org/abs/2310.04610.
Yan, Xiaoli, Nathaniel Hudson, Hyun Park, Daniel Grzenda, J. Gregory Pauloski, Marcus Schwarting, Haochen Pan, et al. 2025. “MOFA: Discovering Materials for Carbon Capture with a GenAI- and Simulation-Based Workflow.” https://arxiv.org/abs/2501.10651.
Zvyagin, Maxim, Alexander Brace, Kyle Hippe, Yuntian Deng, Bin Zhang, Cindy Orozco Bohorquez, Austin Clyde, et al. 2023. “GenSLMs: Genome-Scale Language Models Reveal SARS-CoV-2 Evolutionary Dynamics.” The International Journal of High Performance Computing Applications 37 (6): 683–705.

You can convert any page from its HTML version to its slideshow version by appending /slides to the end of its URL.
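For example, with a hypothetical talk slug: the page at https://samforeman.me/talks/example-talk/ would be rendered as a slideshow at https://samforeman.me/talks/example-talk/slides.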

📆 2025

📆 2024

📆 2023

📆 2022

📆 2021

l2hmc-qcd at the MIT Lattice Group Seminar, 2021

📆 2020

📊 GitHub Stats

[GitHub streak, contribution graph, and Wakatime coding-activity charts]

📂 saforem2/

🎓 Education

🧑‍🔬 Professional Experience

  • Assistant Computational Scientist
    • Argonne National Laboratory, Argonne Leadership Computing Facility (ALCF)
    • Lemont, IL | 2022 – Present
      • Research lead on scaling large language models (LLMs) and generative AI for science on supercomputers (Aurora, Frontier, LUMI, Leonardo, …).
        • Co-lead the Models and Pretraining team of the AuroraGPT project
      • Optimize large-scale training of foundation models and language models for scientific applications.
      • Collaborate with interdisciplinary teams to enhance simulation efficiency and scalability.
      • Focus on AI and HPC for scientific applications, including:
        • Training large language models on supercomputers
        • Genome-scale language models (GenSLMs) for studying SARS-CoV-2 evolutionary dynamics
        • Direct Preference Optimization (DPO) for multimodal protein design workflows (see the sketch after this list)
        • Climate modeling and weather forecasting using foundation models
        • Improved sampling algorithms for lattice quantum chromodynamics (QCD)
      • https://www.alcf.anl.gov/about/people/sam-foreman
  • Postdoctoral Researcher
    • Argonne National Laboratory, Argonne Leadership Computing Facility (ALCF)
    • Lemont, IL | 2019 – 2022
      • Applied deep learning to lattice gauge theory and quantum field simulations.
      • Developed ML-enhanced Monte Carlo methods for QCD.
      • Engaged in AI-for-Science collaborations with national labs and university partners.
  • Graduate Researcher
    • Argonne National Laboratory, Math and Computer Sciences (MCS)
    • Lemont, IL | 2018 – 2019
    • Collaborated with ALCF while completing Ph.D., integrating ML into physical sciences workflows.
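To make the DPO item above concrete, here is a minimal sketch of the standard DPO objective (Rafailov et al., 2023) in PyTorch. This is illustrative only: the function name, tensor shapes, and beta default are assumptions made for the example, not the MProt-DPO implementation.

import torch
import torch.nn.functional as F

def dpo_loss(
    policy_chosen_logp: torch.Tensor,    # log pi_theta(y_w | x), shape (batch,)
    policy_rejected_logp: torch.Tensor,  # log pi_theta(y_l | x), shape (batch,)
    ref_chosen_logp: torch.Tensor,       # log pi_ref(y_w | x), shape (batch,)
    ref_rejected_logp: torch.Tensor,     # log pi_ref(y_l | x), shape (batch,)
    beta: float = 0.1,                   # margin scale (assumed default)
) -> torch.Tensor:
    # How much more likely the policy makes each response than the
    # frozen reference model does:
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    # DPO pushes the preferred response's ratio above the rejected one's:
    margin = beta * (chosen_ratio - rejected_ratio)
    return -F.logsigmoid(margin).mean()

Each log-probability here is the per-sequence sum of token log-probs under the policy or a frozen reference copy of the model; only the policy receives gradients.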

🏆 Awards and Honors

📚 Publications

🎤 Selected Talks

🎪 Events

👔 Employment

Table 1: 📟 Experience

Position                            @      Start   End
Assistant Computational Scientist   ALCF   2022    Present
Postdoc                             ALCF   2019    2022
Graduate Researcher                 ANL    2018    2019

🍎 School

Table 2: 🎓 Education

Degree   In        @                    End
PhD      Physics   University of Iowa   2019
B.Sc     Physics   UIUC                 2015
B.Sc     Math      UIUC                 2015

Footnotes

  1. See full list on Google Scholar

  2. See full list at: samforeman.me/talks

Citation

BibTeX citation:
@online{foreman2025,
  author = {Foreman, Sam},
  title = {Sam Foreman's Personal Website},
  date = {2025-07-10},
  url = {https://samforeman.me/},
  langid = {en}
}
For attribution, please cite this work as:
Foreman, Sam. 2025. “Sam Foreman’s Personal Website.” July 10, 2025. https://samforeman.me/.