
Efficient interatomic descriptors for accurate machine learning force fields of extended molecules

Author

Listed:
  • Adil Kabylda (University of Luxembourg)
  • Valentin Vassilev-Galindo (University of Luxembourg)
  • Stefan Chmiela (Technische Universität Berlin; BIFOLD – Berlin Institute for the Foundations of Learning and Data)
  • Igor Poltavsky (University of Luxembourg)
  • Alexandre Tkatchenko (University of Luxembourg)

Abstract

Machine learning force fields (MLFFs) are gradually evolving towards enabling molecular dynamics simulations of molecules and materials with ab initio accuracy, but at a small fraction of the computational cost. However, several challenges remain to be addressed before MLFF simulations of realistic molecules become predictive, including: (1) developing efficient descriptors for non-local interatomic interactions, which are essential to capture long-range molecular fluctuations, and (2) reducing the dimensionality of the descriptors to enhance the applicability and interpretability of MLFFs. Here we propose an automated approach that substantially reduces the number of interatomic descriptor features while preserving the accuracy and increasing the efficiency of MLFFs. We illustrate the approach, which addresses both challenges simultaneously, on the global GDML MLFF. We find that non-local features (atoms separated by up to 15 Å in the studied systems) are crucial to retain the overall accuracy of the MLFF for peptides, DNA base pairs, fatty acids, and supramolecular complexes. Interestingly, the number of required non-local features in the reduced descriptors becomes comparable to the number of local interatomic features (those below 5 Å). These results pave the way to constructing global molecular MLFFs whose cost increases linearly, instead of quadratically, with system size.
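The reduction scheme described in the abstract can be pictured with a short sketch. The Python snippet below (a minimal illustration, not the authors' implementation) builds a GDML-style descriptor of inverse pairwise distances and keeps all local features (pairs below 5 Å) plus a comparably sized subset of non-local features up to 15 Å. The 5 Å and 15 Å values come from the abstract; the shortest-first selection rule and the function names are illustrative assumptions, whereas the paper uses an automated, data-driven feature selection.

```python
# Minimal sketch (assumption-laden, not the authors' code): build a
# GDML-style inverse-pairwise-distance descriptor and retain all local
# features plus a reduced set of non-local features.
import numpy as np

def inverse_distance_descriptor(coords):
    """Return the upper-triangular 1/r_ij features and the distances r_ij.

    coords: (n_atoms, 3) Cartesian positions in Å.
    """
    diff = coords[:, None, :] - coords[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    iu = np.triu_indices(len(coords), k=1)      # each atom pair counted once
    r = dists[iu]
    return 1.0 / r, r

def select_features(r, local_cutoff=5.0, nonlocal_cutoff=15.0, keep_nonlocal=None):
    """Indices of retained features: all local pairs (< local_cutoff) plus a
    subset of non-local pairs up to nonlocal_cutoff (illustrative rule only)."""
    local = np.where(r < local_cutoff)[0]
    nonlocal_idx = np.where((r >= local_cutoff) & (r <= nonlocal_cutoff))[0]
    if keep_nonlocal is None:
        # Abstract: the number of required non-local features becomes
        # comparable to the number of local ones.
        keep_nonlocal = len(local)
    # Illustrative choice: keep the shortest non-local pairs; the paper
    # instead uses an automated, accuracy-driven selection.
    nonlocal_idx = nonlocal_idx[np.argsort(r[nonlocal_idx])][:keep_nonlocal]
    return np.sort(np.concatenate([local, nonlocal_idx]))

# Usage with a toy 60-atom geometry (random positions in a 20 Å box).
coords = np.random.rand(60, 3) * 20.0
desc, r = inverse_distance_descriptor(coords)
keep = select_features(r)
print(desc.size, keep.size)   # full vs. reduced feature count
```

For the toy 60-atom geometry, the full descriptor has 60·59/2 = 1770 features, while the retained set is bounded by roughly twice the number of local pairs and therefore grows approximately linearly with the number of atoms, mirroring the linear-versus-quadratic scaling argument in the abstract.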

Suggested Citation

  • Adil Kabylda & Valentin Vassilev-Galindo & Stefan Chmiela & Igor Poltavsky & Alexandre Tkatchenko, 2023. "Efficient interatomic descriptors for accurate machine learning force fields of extended molecules," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
  • Handle: RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-39214-w
    DOI: 10.1038/s41467-023-39214-w

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-023-39214-w
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-023-39214-w?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. J. Thorben Frank & Oliver T. Unke & Klaus-Robert Müller & Stefan Chmiela, 2024. "A Euclidean transformer for fast and stable machine learned force fields," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    2. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    3. Yaolong Zhang & Bin Jiang, 2023. "Universal machine learning for the response of atomistic systems to external fields," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    4. Hanwen Zhang & Veronika Juraskova & Fernanda Duarte, 2024. "Modelling chemical processes in explicit solvents with machine learning potentials," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    5. Peikun Zheng & Roman Zubatyuk & Wei Wu & Olexandr Isayev & Pavlo O. Dral, 2021. "Artificial intelligence-enhanced quantum chemical method with broad applicability," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
    6. David Buterez & Jon Paul Janet & Steven J. Kiddle & Dino Oglic & Pietro Lió, 2024. "Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    7. Kit Joll & Philipp Schienbein & Kevin M. Rosso & Jochen Blumberger, 2024. "Machine learning the electric field response of condensed phase systems using perturbed neural network potentials," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    8. Linus C. Erhard & Jochen Rohrer & Karsten Albe & Volker L. Deringer, 2024. "Modelling atomic and nanoscale structure in the silicon–oxygen system through active machine learning," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    9. Ang Gao & Richard C. Remsing, 2022. "Self-consistent determination of long-range electrostatics in neural network potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    10. Junjie Wang & Yong Wang & Haoting Zhang & Ziyang Yang & Zhixin Liang & Jiuyang Shi & Hui-Tian Wang & Dingyu Xing & Jian Sun, 2024. "E(n)-Equivariant cartesian tensor message passing interatomic potential," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    11. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.
    12. Yusong Wang & Tong Wang & Shaoning Li & Xinheng He & Mingyu Li & Zun Wang & Nanning Zheng & Bin Shao & Tie-Yan Liu, 2024. "Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    13. Albert Musaelian & Simon Batzner & Anders Johansson & Lixin Sun & Cameron J. Owen & Mordechai Kornbluth & Boris Kozinsky, 2023. "Learning local equivariant representations for large-scale atomistic dynamics," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    14. Charlotte Loh & Thomas Christensen & Rumen Dangovski & Samuel Kim & Marin Soljačić, 2022. "Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    15. Tian Xie & Arthur France-Lanord & Yanming Wang & Jeffrey Lopez & Michael A. Stolberg & Megan Hill & Graham Michael Leverick & Rafael Gomez-Bombarelli & Jeremiah A. Johnson & Yang Shao-Horn & Jeffrey C, 2022. "Accelerating amorphous polymer electrolyte screening by learning to reduce errors in molecular dynamics simulated properties," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    16. Andreas Erlebach & Martin Šípka & Indranil Saha & Petr Nachtigall & Christopher J. Heard & Lukáš Grajciar, 2024. "A reactive neural network framework for water-loaded acidic zeolites," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    17. Zhao Fan & Hajime Tanaka, 2024. "Microscopic mechanisms of pressure-induced amorphous-amorphous transitions and crystallisation in silicon," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    18. Shuai Jiang & Yi-Rong Liu & Teng Huang & Ya-Juan Feng & Chun-Yu Wang & Zhong-Quan Wang & Bin-Jing Ge & Quan-Sheng Liu & Wei-Ran Guang & Wei Huang, 2022. "Towards fully ab initio simulation of atmospheric aerosol nucleation," Nature Communications, Nature, vol. 13(1), pages 1-8, December.
    19. Niklas W. A. Gebauer & Michael Gastegger & Stefaan S. P. Hessmann & Klaus-Robert Müller & Kristof T. Schütt, 2022. "Inverse design of 3d molecular structures with conditional generative neural networks," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    20. Li Zheng & Konstantinos Karapiperis & Siddhant Kumar & Dennis M. Kochmann, 2023. "Unifying the design space and optimizing linear and nonlinear truss metamaterials by generative modeling," Nature Communications, Nature, vol. 14(1), pages 1-14, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-39214-w. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows your profile to be linked to this item and lets you accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.