
A Euclidean transformer for fast and stable machine learned force fields

Authors
  • J. Thorben Frank (TU Berlin; Berlin Institute for the Foundations of Learning and Data)
  • Oliver T. Unke (Google DeepMind)
  • Klaus-Robert Müller (TU Berlin; Berlin Institute for the Foundations of Learning and Data; Google DeepMind; Korea University)
  • Stefan Chmiela (TU Berlin; Berlin Institute for the Foundations of Learning and Data)

Abstract

Recent years have seen vast progress in the development of machine learned force fields (MLFFs) based on ab initio reference calculations. Despite achieving low test errors, the reliability of MLFFs in molecular dynamics (MD) simulations is facing growing scrutiny due to concerns about instability over extended simulation timescales. Our findings suggest a potential connection between robustness to cumulative inaccuracies and the use of equivariant representations in MLFFs, but the computational cost associated with these representations can limit this advantage in practice. To address this, we propose a transformer architecture called SO3krates that combines sparse equivariant representations (Euclidean variables) with a self-attention mechanism that separates invariant and equivariant information, eliminating the need for expensive tensor products. SO3krates achieves a unique combination of accuracy, stability, and speed that enables insightful analysis of quantum properties of matter on extended time and system size scales. To showcase this capability, we generate stable MD trajectories for flexible peptides and supramolecular structures with hundreds of atoms. Furthermore, we investigate the topology of the potential energy surface (PES) for medium-sized chainlike molecules (e.g., small peptides) by exploring thousands of minima. Remarkably, SO3krates demonstrates the ability to strike a balance between the conflicting demands of stability and the emergence of new minimum-energy conformations beyond the training data, which is crucial for realistic exploration tasks in the field of biochemistry.
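The separation of invariant and equivariant information mentioned in the abstract can be illustrated with a short sketch. The Python/NumPy code below is a minimal illustration of the general principle, not the authors' SO3krates implementation; all function and variable names are assumptions made for illustration. The idea: attention weights are computed from rotation-invariant features only, so they are scalars under rotation, and the same weights update both the invariant and the equivariant (vector-valued) feature stream. Equivariance is then preserved without any tensor products between equivariant features.

import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def invariant_equivariant_attention(x, v, W_q, W_k, W_v):
    # x : (n, d)    rotation-invariant per-atom features
    # v : (n, 3, d) rotation-equivariant (vector-valued) per-atom features
    # Queries and keys come from the invariant stream only, so the
    # attention weights alpha are scalars under rotation.
    q, k = x @ W_q, x @ W_k
    alpha = softmax(q @ k.T / np.sqrt(q.shape[-1]))  # (n, n), invariant
    # Mixing atoms with scalar weights updates both streams at once;
    # no tensor product between equivariant features is needed.
    x_new = alpha @ (x @ W_v)                   # invariant update
    v_new = np.einsum('ij,jcd->icd', alpha, v)  # equivariant update
    return x_new, v_new

# Equivariance check: rotating the inputs commutes with the update.
rng = np.random.default_rng(0)
n, d = 5, 8
x = rng.normal(size=(n, d))
v = rng.normal(size=(n, 3, d))
W_q, W_k, W_v = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
_, v_rot_then_att = invariant_equivariant_attention(
    x, np.einsum('ab,nbd->nad', R, v), W_q, W_k, W_v)
_, v_att = invariant_equivariant_attention(x, v, W_q, W_k, W_v)
assert np.allclose(v_rot_then_att, np.einsum('ab,nbd->nad', R, v_att))

The final assertion checks the key property: rotating the input vectors before the attention update gives the same result as applying the update first and rotating afterwards, because the attention weights depend only on the invariant stream.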

Suggested Citation

  • J. Thorben Frank & Oliver T. Unke & Klaus-Robert Müller & Stefan Chmiela, 2024. "A Euclidean transformer for fast and stable machine learned force fields," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
  • Handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-50620-6
    DOI: 10.1038/s41467-024-50620-6

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-024-50620-6
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-024-50620-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kristof T. Schütt & Farhad Arbabzadah & Stefan Chmiela & Klaus R. Müller & Alexandre Tkatchenko, 2017. "Quantum-chemical insights from deep tensor neural networks," Nature Communications, Nature, vol. 8(1), pages 1-8, April.
    2. Albert Musaelian & Simon Batzner & Anders Johansson & Lixin Sun & Cameron J. Owen & Mordechai Kornbluth & Boris Kozinsky, 2023. "Learning local equivariant representations for large-scale atomistic dynamics," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    3. Tsz Wai Ko & Jonas A. Finkler & Stefan Goedecker & Jörg Behler, 2021. "A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer," Nature Communications, Nature, vol. 12(1), pages 1-11, December.
    4. Stefan Chmiela & Huziel E. Sauceda & Klaus-Robert Müller & Alexandre Tkatchenko, 2018. "Towards exact molecular dynamics simulations with machine-learned force fields," Nature Communications, Nature, vol. 9(1), pages 1-10, December.
    5. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    6. Simon Batzner & Albert Musaelian & Lixin Sun & Mario Geiger & Jonathan P. Mailoa & Mordechai Kornbluth & Nicola Molinari & Tess E. Smidt & Boris Kozinsky, 2022. "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    7. Yusong Wang & Tong Wang & Shaoning Li & Xinheng He & Mingyu Li & Zun Wang & Nanning Zheng & Bin Shao & Tie-Yan Liu, 2024. "Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    8. Oliver T. Unke & Stefan Chmiela & Michael Gastegger & Kristof T. Schütt & Huziel E. Sauceda & Klaus-Robert Müller, 2021. "SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects," Nature Communications, Nature, vol. 12(1), pages 1-14, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Adil Kabylda & Valentin Vassilev-Galindo & Stefan Chmiela & Igor Poltavsky & Alexandre Tkatchenko, 2023. "Efficient interatomic descriptors for accurate machine learning force fields of extended molecules," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    2. Yusong Wang & Tong Wang & Shaoning Li & Xinheng He & Mingyu Li & Zun Wang & Nanning Zheng & Bin Shao & Tie-Yan Liu, 2024. "Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    3. Albert Musaelian & Simon Batzner & Anders Johansson & Lixin Sun & Cameron J. Owen & Mordechai Kornbluth & Boris Kozinsky, 2023. "Learning local equivariant representations for large-scale atomistic dynamics," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    4. Charlotte Loh & Thomas Christensen & Rumen Dangovski & Samuel Kim & Marin Soljačić, 2022. "Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    5. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    6. Yaolong Zhang & Bin Jiang, 2023. "Universal machine learning for the response of atomistic systems to external fields," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    7. Yuanming Bai & Leslie Vogt-Maranto & Mark E. Tuckerman & William J. Glover, 2022. "Machine learning the Hohenberg-Kohn map for molecular excited states," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    8. Alessio Fallani & Leonardo Medrano Sandonas & Alexandre Tkatchenko, 2024. "Inverse mapping of quantum properties to structures for chemical space of small organic molecules," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    9. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.
    10. Simon Batzner & Albert Musaelian & Lixin Sun & Mario Geiger & Jonathan P. Mailoa & Mordechai Kornbluth & Nicola Molinari & Tess E. Smidt & Boris Kozinsky, 2022. "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    11. Oliver T. Unke & Stefan Chmiela & Michael Gastegger & Kristof T. Schütt & Huziel E. Sauceda & Klaus-Robert Müller, 2021. "SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects," Nature Communications, Nature, vol. 12(1), pages 1-14, December.
    12. Hanwen Zhang & Veronika Juraskova & Fernanda Duarte, 2024. "Modelling chemical processes in explicit solvents with machine learning potentials," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    13. Andreas Erlebach & Martin Šípka & Indranil Saha & Petr Nachtigall & Christopher J. Heard & Lukáš Grajciar, 2024. "A reactive neural network framework for water-loaded acidic zeolites," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    14. Peikun Zheng & Roman Zubatyuk & Wei Wu & Olexandr Isayev & Pavlo O. Dral, 2021. "Artificial intelligence-enhanced quantum chemical method with broad applicability," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
    15. Niklas W. A. Gebauer & Michael Gastegger & Stefaan S. P. Hessmann & Klaus-Robert Müller & Kristof T. Schütt, 2022. "Inverse design of 3d molecular structures with conditional generative neural networks," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    16. Li Zheng & Konstantinos Karapiperis & Siddhant Kumar & Dennis M. Kochmann, 2023. "Unifying the design space and optimizing linear and nonlinear truss metamaterials by generative modeling," Nature Communications, Nature, vol. 14(1), pages 1-14, December.
    17. Nikita Moshkov & Tim Becker & Kevin Yang & Peter Horvath & Vlado Dancik & Bridget K. Wagner & Paul A. Clemons & Shantanu Singh & Anne E. Carpenter & Juan C. Caicedo, 2023. "Predicting compound activity from phenotypic profiles and chemical structures," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    18. Xing Chen & Flavio Abreu Araujo & Mathieu Riou & Jacob Torrejon & Dafiné Ravelosona & Wang Kang & Weisheng Zhao & Julie Grollier & Damien Querlioz, 2022. "Forecasting the outcome of spintronic experiments with Neural Ordinary Differential Equations," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    19. David Buterez & Jon Paul Janet & Steven J. Kiddle & Dino Oglic & Pietro Lió, 2024. "Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    20. Linus C. Erhard & Jochen Rohrer & Karsten Albe & Volker L. Deringer, 2024. "Modelling atomic and nanoscale structure in the silicon–oxygen system through active machine learning," Nature Communications, Nature, vol. 15(1), pages 1-12, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-50620-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.