Printed from https://ideas.repec.org/a/nat/natcom/v12y2021i1d10.1038_s41467-021-27504-0.html

SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects

Authors

Listed:
  • Oliver T. Unke (Technische Universität Berlin)

  • Stefan Chmiela (Technische Universität Berlin)

  • Michael Gastegger (Technische Universität Berlin)

  • Kristof T. Schütt (Technische Universität Berlin)

  • Huziel E. Sauceda (Technische Universität Berlin)

  • Klaus-Robert Müller (Technische Universität Berlin; Korea University, Anam-dong, Seongbuk-gu; Max Planck Institute for Informatics, Stuhlsatzenhausweg; BIFOLD—Berlin Institute for the Foundations of Learning and Data)

Abstract

Machine-learned force fields combine the accuracy of ab initio methods with the efficiency of conventional force fields. However, current machine-learned force fields typically ignore electronic degrees of freedom, such as the total charge or spin state, and assume chemical locality, which is problematic when molecules have inconsistent electronic states, or when nonlocal effects play a significant role. This work introduces SpookyNet, a deep neural network for constructing machine-learned force fields with explicit treatment of electronic degrees of freedom and nonlocality, modeled via self-attention in a transformer architecture. Chemically meaningful inductive biases and analytical corrections built into the network architecture allow it to properly model physical limits. SpookyNet improves upon the current state-of-the-art (or achieves similar performance) on popular quantum chemistry data sets. Notably, it is able to generalize across chemical and conformational space and can leverage the learned chemical insights, e.g. by predicting unknown spin states, thus helping to close a further important remaining gap for today’s machine learning models in quantum chemistry.
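The abstract's central architectural idea, modeling nonlocal interactions via self-attention so that every atom can exchange information with every other atom regardless of distance, can be illustrated with a minimal sketch. This is a generic single-head self-attention update over per-atom feature vectors, not the authors' SpookyNet architecture; the function name, the way total charge and spin enter as global conditioning vectors, and all parameter names are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax (subtract the row maximum before exp)
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nonlocal_update(features, charge, spin, Wq, Wk, Wv, w_c, w_s):
    """One self-attention pass over per-atom features (illustrative sketch).

    Every atom attends to every other atom, so information can flow between
    distant atoms in a single update, with no locality cutoff. The global
    electronic state (total charge and spin) is broadcast into each atomic
    embedding via the hypothetical conditioning vectors w_c and w_s.
    """
    x = features + charge * w_c + spin * w_s           # (n_atoms, d)
    q, k, v = x @ Wq, x @ Wk, x @ Wv                   # query/key/value projections
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))     # (n_atoms, n_atoms) weights
    return features + attn @ v                         # residual update

rng = np.random.default_rng(0)
n_atoms, d = 5, 8
out = nonlocal_update(
    rng.normal(size=(n_atoms, d)), charge=1.0, spin=0.0,
    Wq=rng.normal(size=(d, d)), Wk=rng.normal(size=(d, d)),
    Wv=rng.normal(size=(d, d)),
    w_c=rng.normal(size=d), w_s=rng.normal(size=d),
)
print(out.shape)  # → (5, 8)
```

Because charge and spin shift the embeddings before attention, two systems with identical geometry but different electronic states produce different updated features, which is the gap in locality-assuming models that the paper addresses.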

Suggested Citation

  • Oliver T. Unke & Stefan Chmiela & Michael Gastegger & Kristof T. Schütt & Huziel E. Sauceda & Klaus-Robert Müller, 2021. "SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects," Nature Communications, Nature, vol. 12(1), pages 1-14, December.
  • Handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-27504-0
    DOI: 10.1038/s41467-021-27504-0

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-021-27504-0
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-021-27504-0?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kristof T. Schütt & Farhad Arbabzadah & Stefan Chmiela & Klaus R. Müller & Alexandre Tkatchenko, 2017. "Quantum-chemical insights from deep tensor neural networks," Nature Communications, Nature, vol. 8(1), pages 1-8, April.
    2. Volker L. Deringer & Miguel A. Caro & Gábor Csányi, 2020. "A general-purpose machine-learning force field for bulk and nanostructured phosphorus," Nature Communications, Nature, vol. 11(1), pages 1-11, December.
    3. K. T. Schütt & M. Gastegger & A. Tkatchenko & K.-R. Müller & R. J. Maurer, 2019. "Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions," Nature Communications, Nature, vol. 10(1), pages 1-10, December.
    4. Sebastian Lapuschkin & Stephan Wäldchen & Alexander Binder & Grégoire Montavon & Wojciech Samek & Klaus-Robert Müller, 2019. "Unmasking Clever Hans predictors and assessing what machines really learn," Nature Communications, Nature, vol. 10(1), pages 1-8, December.
    5. Tsz Wai Ko & Jonas A. Finkler & Stefan Goedecker & Jörg Behler, 2021. "A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer," Nature Communications, Nature, vol. 12(1), pages 1-11, December.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. J. Thorben Frank & Oliver T. Unke & Klaus-Robert Müller & Stefan Chmiela, 2024. "A Euclidean transformer for fast and stable machine learned force fields," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    2. Zechen Tang & He Li & Peize Lin & Xiaoxun Gong & Gan Jin & Lixin He & Hong Jiang & Xinguo Ren & Wenhui Duan & Yong Xu, 2024. "A deep equivariant neural network approach for efficient hybrid density functional calculations," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    3. Kit Joll & Philipp Schienbein & Kevin M. Rosso & Jochen Blumberger, 2024. "Machine learning the electric field response of condensed phase systems using perturbed neural network potentials," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    4. Ziduo Yang & Yi-Ming Zhao & Xian Wang & Xiaoqing Liu & Xiuying Zhang & Yifan Li & Qiujie Lv & Calvin Yu-Chian Chen & Lei Shen, 2024. "Scalable crystal structure relaxation using an iteration-free deep generative model with uncertainty quantification," Nature Communications, Nature, vol. 15(1), pages 1-15, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. J. Thorben Frank & Oliver T. Unke & Klaus-Robert Müller & Stefan Chmiela, 2024. "A Euclidean transformer for fast and stable machine learned force fields," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    2. Yuanming Bai & Leslie Vogt-Maranto & Mark E. Tuckerman & William J. Glover, 2022. "Machine learning the Hohenberg-Kohn map for molecular excited states," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    3. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.
    4. Tobias Thomas & Dominik Straub & Fabian Tatai & Megan Shene & Tümer Tosik & Kristian Kersting & Constantin A. Rothkopf, 2024. "Modelling dataset bias in machine-learned theories of economic decision-making," Nature Human Behaviour, Nature, vol. 8(4), pages 679-691, April.
    5. Laura Lewis & Hsin-Yuan Huang & Viet T. Tran & Sebastian Lehner & Richard Kueng & John Preskill, 2024. "Improved machine learning algorithm for predicting ground state properties," Nature Communications, Nature, vol. 15(1), pages 1-8, December.
    6. Van Den Hauwe, Ludwig, 2023. "Why Machines Will Not Replace Entrepreneurs. On the Inevitable Limitations of Artificial Intelligence in Economic Life," MPRA Paper 118307, University Library of Munich, Germany.
    7. Peikun Zheng & Roman Zubatyuk & Wei Wu & Olexandr Isayev & Pavlo O. Dral, 2021. "Artificial intelligence-enhanced quantum chemical method with broad applicability," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
    8. Nikita Moshkov & Tim Becker & Kevin Yang & Peter Horvath & Vlado Dancik & Bridget K. Wagner & Paul A. Clemons & Shantanu Singh & Anne E. Carpenter & Juan C. Caicedo, 2023. "Predicting compound activity from phenotypic profiles and chemical structures," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    9. Amin Alibakhshi & Bernd Hartke, 2022. "Implicitly perturbed Hamiltonian as a class of versatile and general-purpose molecular representations for machine learning," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    10. Martin Obschonka & David B. Audretsch, 2020. "Artificial intelligence and big data in entrepreneurship: a new era has begun," Small Business Economics, Springer, vol. 55(3), pages 529-539, October.
    11. Jerome Friedman & Trevor Hastie & Robert Tibshirani, 2020. "Discussion of “Prediction, Estimation, and Attribution” by Bradley Efron," International Statistical Review, International Statistical Institute, vol. 88(S1), pages 73-74, December.
    12. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    13. Xing Chen & Flavio Abreu Araujo & Mathieu Riou & Jacob Torrejon & Dafiné Ravelosona & Wang Kang & Weisheng Zhao & Julie Grollier & Damien Querlioz, 2022. "Forecasting the outcome of spintronic experiments with Neural Ordinary Differential Equations," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    14. Xun Li & Dongsheng Chen & Weipan Xu & Haohui Chen & Junjun Li & Fan Mo, 2023. "Explainable dimensionality reduction (XDR) to unbox AI ‘black box’ models: A study of AI perspectives on the ethnic styles of village dwellings," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-13, December.
    15. March, Christoph, 2021. "Strategic interactions between humans and artificial intelligence: Lessons from experiments with computer players," Journal of Economic Psychology, Elsevier, vol. 87(C).
    16. Kit Joll & Philipp Schienbein & Kevin M. Rosso & Jochen Blumberger, 2024. "Machine learning the electric field response of condensed phase systems using perturbed neural network potentials," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    17. Linus C. Erhard & Jochen Rohrer & Karsten Albe & Volker L. Deringer, 2024. "Modelling atomic and nanoscale structure in the silicon–oxygen system through active machine learning," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    18. Young Jae Kim & Jung-Im Na & Seung Seog Han & Chong Hyun Won & Mi Woo Lee & Jung-Won Shin & Chang-Hun Huh & Sung Eun Chang, 2022. "Augmenting the accuracy of trainee doctors in diagnosing skin lesions suspected of skin neoplasms in a real-world setting: A prospective controlled before-and-after study," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-11, January.
    19. Adil Kabylda & Valentin Vassilev-Galindo & Stefan Chmiela & Igor Poltavsky & Alexandre Tkatchenko, 2023. "Efficient interatomic descriptors for accurate machine learning force fields of extended molecules," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    20. Ang Gao & Richard C. Remsing, 2022. "Self-consistent determination of long-range electrostatics in neural network potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-27504-0. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.