
SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects

Authors

Listed:
  • Oliver T. Unke

    (Technische Universität Berlin)

  • Stefan Chmiela

    (Technische Universität Berlin)

  • Michael Gastegger

    (Technische Universität Berlin)

  • Kristof T. Schütt

    (Technische Universität Berlin)

  • Huziel E. Sauceda

    (Technische Universität Berlin)

  • Klaus-Robert Müller

    (Technische Universität Berlin
    Korea University
    Max Planck Institute for Informatics
    BIFOLD—Berlin Institute for the Foundations of Learning and Data)

Abstract

Machine-learned force fields combine the accuracy of ab initio methods with the efficiency of conventional force fields. However, current machine-learned force fields typically ignore electronic degrees of freedom, such as the total charge or spin state, and assume chemical locality, which is problematic when molecules have inconsistent electronic states, or when nonlocal effects play a significant role. This work introduces SpookyNet, a deep neural network for constructing machine-learned force fields with explicit treatment of electronic degrees of freedom and nonlocality, modeled via self-attention in a transformer architecture. Chemically meaningful inductive biases and analytical corrections built into the network architecture allow it to properly model physical limits. SpookyNet improves upon the current state-of-the-art (or achieves similar performance) on popular quantum chemistry data sets. Notably, it is able to generalize across chemical and conformational space and can leverage the learned chemical insights, e.g. by predicting unknown spin states, thus helping to close a further important remaining gap for today’s machine learning models in quantum chemistry.
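
The two ideas highlighted in the abstract, conditioning on electronic degrees of freedom and modeling nonlocal interactions with self-attention, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it is a plain NumPy toy in which all shapes, weight matrices, and the example molecule are hypothetical placeholders. It shows how a total charge Q and a spin state S could be folded into per-atom features, and how one unmasked self-attention pass lets every atom exchange information with every other atom regardless of distance.

```python
# Toy sketch (NOT the SpookyNet code): electronic degrees of freedom + nonlocal
# self-attention over per-atom features. All sizes and weights are made up.
import numpy as np

rng = np.random.default_rng(0)
n_atoms, d = 5, 16                      # hypothetical molecule: 5 atoms, 16 features each

# Per-atom features, e.g. from element embeddings and local message passing (assumed given).
x = rng.normal(size=(n_atoms, d))

# Electronic degrees of freedom: broadcast embeddings of the total charge Q and the
# spin state S to every atom, so e.g. a cation and an anion get different features.
Q, S = -1, 0                            # total charge and number of unpaired electrons
W_charge = rng.normal(size=(1, d)) * 0.1   # stand-ins for learned embedding weights
W_spin = rng.normal(size=(1, d)) * 0.1
x = x + Q * W_charge + S * W_spin

# Nonlocal interactions via unmasked scaled dot-product self-attention:
# every atom attends to every other atom, with no distance cutoff.
W_query, W_key, W_value = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
q, k, v = x @ W_query, x @ W_key, x @ W_value
scores = q @ k.T / np.sqrt(d)                         # (n_atoms, n_atoms) attention logits
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)         # row-wise softmax
x_nonlocal = x + weights @ v                          # residual update with nonlocal information

# A scalar readout standing in for the predicted energy.
energy = float(x_nonlocal.sum())
print(f"toy energy prediction: {energy:.4f}")
```

In a full force-field model the forces follow as negative gradients of the predicted energy with respect to the atomic positions, which requires a differentiable framework rather than plain NumPy; the sketch only illustrates how charge, spin, and nonlocal information can enter the atomic representations.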

Suggested Citation

  • Oliver T. Unke & Stefan Chmiela & Michael Gastegger & Kristof T. Schütt & Huziel E. Sauceda & Klaus-Robert Müller, 2021. "SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects," Nature Communications, Nature, vol. 12(1), pages 1-14, December.
  • Handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-27504-0
    DOI: 10.1038/s41467-021-27504-0

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-021-27504-0
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-021-27504-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kristof T. Schütt & Farhad Arbabzadah & Stefan Chmiela & Klaus R. Müller & Alexandre Tkatchenko, 2017. "Quantum-chemical insights from deep tensor neural networks," Nature Communications, Nature, vol. 8(1), pages 1-8, April.
    2. Tsz Wai Ko & Jonas A. Finkler & Stefan Goedecker & Jörg Behler, 2021. "A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer," Nature Communications, Nature, vol. 12(1), pages 1-11, December.
    3. Sebastian Lapuschkin & Stephan Wäldchen & Alexander Binder & Grégoire Montavon & Wojciech Samek & Klaus-Robert Müller, 2019. "Unmasking Clever Hans predictors and assessing what machines really learn," Nature Communications, Nature, vol. 10(1), pages 1-8, December.
    4. Volker L. Deringer & Miguel A. Caro & Gábor Csányi, 2020. "A general-purpose machine-learning force field for bulk and nanostructured phosphorus," Nature Communications, Nature, vol. 11(1), pages 1-11, December.
    5. K. T. Schütt & M. Gastegger & A. Tkatchenko & K.-R. Müller & R. J. Maurer, 2019. "Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions," Nature Communications, Nature, vol. 10(1), pages 1-10, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.
    2. Yuanming Bai & Leslie Vogt-Maranto & Mark E. Tuckerman & William J. Glover, 2022. "Machine learning the Hohenberg-Kohn map for molecular excited states," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    3. Laura Lewis & Hsin-Yuan Huang & Viet T. Tran & Sebastian Lehner & Richard Kueng & John Preskill, 2024. "Improved machine learning algorithm for predicting ground state properties," Nature Communications, Nature, vol. 15(1), pages 1-8, December.
    4. Van Den Hauwe, Ludwig, 2023. "Why Machines Will Not Replace Entrepreneurs. On the Inevitable Limitations of Artificial Intelligence in Economic Life," MPRA Paper 118307, University Library of Munich, Germany.
    5. Peikun Zheng & Roman Zubatyuk & Wei Wu & Olexandr Isayev & Pavlo O. Dral, 2021. "Artificial intelligence-enhanced quantum chemical method with broad applicability," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
    6. Nikita Moshkov & Tim Becker & Kevin Yang & Peter Horvath & Vlado Dancik & Bridget K. Wagner & Paul A. Clemons & Shantanu Singh & Anne E. Carpenter & Juan C. Caicedo, 2023. "Predicting compound activity from phenotypic profiles and chemical structures," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    7. Amin Alibakhshi & Bernd Hartke, 2022. "Implicitly perturbed Hamiltonian as a class of versatile and general-purpose molecular representations for machine learning," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    8. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    9. Xing Chen & Flavio Abreu Araujo & Mathieu Riou & Jacob Torrejon & Dafiné Ravelosona & Wang Kang & Weisheng Zhao & Julie Grollier & Damien Querlioz, 2022. "Forecasting the outcome of spintronic experiments with Neural Ordinary Differential Equations," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    10. Xun Li & Dongsheng Chen & Weipan Xu & Haohui Chen & Junjun Li & Fan Mo, 2023. "Explainable dimensionality reduction (XDR) to unbox AI ‘black box’ models: A study of AI perspectives on the ethnic styles of village dwellings," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-13, December.
    11. Young Jae Kim & Jung-Im Na & Seung Seog Han & Chong Hyun Won & Mi Woo Lee & Jung-Won Shin & Chang-Hun Huh & Sung Eun Chang, 2022. "Augmenting the accuracy of trainee doctors in diagnosing skin lesions suspected of skin neoplasms in a real-world setting: A prospective controlled before-and-after study," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-11, January.
    12. Xiao Tan & Yuan Zhou & Zuohua Ding & Yang Liu, 2021. "Selecting Correct Methods to Extract Fuzzy Rules from Artificial Neural Network," Mathematics, MDPI, vol. 9(11), pages 1-22, May.
    13. Yusong Wang & Tong Wang & Shaoning Li & Xinheng He & Mingyu Li & Zun Wang & Nanning Zheng & Bin Shao & Tie-Yan Liu, 2024. "Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    14. Simon Batzner & Albert Musaelian & Lixin Sun & Mario Geiger & Jonathan P. Mailoa & Mordechai Kornbluth & Nicola Molinari & Tess E. Smidt & Boris Kozinsky, 2022. "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    15. Shane Fox & James McDermott & Edelle Doherty & Ronan Cooney & Eoghan Clifford, 2022. "Application of Neural Networks and Regression Modelling to Enable Environmental Regulatory Compliance and Energy Optimisation in a Sequencing Batch Reactor," Sustainability, MDPI, vol. 14(7), pages 1-28, March.
    16. Yi Yang & Xinwei Li & Huamin Li & Dongyin Li & Ruifu Yuan, 2020. "Deep Q-Network for Optimal Decision for Top-Coal Caving," Energies, MDPI, vol. 13(7), pages 1-14, April.
    17. Verhagen, Mark D., 2021. "Identifying and Improving Functional Form Complexity: A Machine Learning Framework," SocArXiv bka76, Center for Open Science.
    18. March, Christoph, 2019. "The behavioral economics of artificial intelligence: Lessons from experiments with computer players," BERG Working Paper Series 154, Bamberg University, Bamberg Economic Research Group.
    19. Charlotte Loh & Thomas Christensen & Rumen Dangovski & Samuel Kim & Marin Soljačić, 2022. "Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    20. Minji Lee & Leandro R. D. Sanz & Alice Barra & Audrey Wolff & Jaakko O. Nieminen & Melanie Boly & Mario Rosanova & Silvia Casarotto & Olivier Bodart & Jitka Annen & Aurore Thibaut & Rajanikant Panda &, 2022. "Quantifying arousal and awareness in altered states of consciousness using interpretable deep learning," Nature Communications, Nature, vol. 13(1), pages 1-14, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-27504-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.