
A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer

Authors

Listed:
  • Tsz Wai Ko

    (Universität Göttingen, Institut für Physikalische Chemie, Theoretische Chemie)

  • Jonas A. Finkler

    (Department of Physics, Universität Basel)

  • Stefan Goedecker

    (Department of Physics, Universität Basel)

  • Jörg Behler

    (Universität Göttingen, Institut für Physikalische Chemie, Theoretische Chemie)

Abstract

Machine learning potentials have become an important tool for atomistic simulations in many fields, from chemistry through molecular biology to materials science. Most established methods, however, rely on local properties and are thus unable to account for global changes in the electronic structure that result from long-range charge transfer or from different charge states. In this work we overcome this limitation by introducing a fourth-generation high-dimensional neural network potential that combines a charge-equilibration scheme employing environment-dependent atomic electronegativities with accurate atomic energies. The method, which correctly describes global charge distributions in arbitrary systems, yields much-improved energies and substantially extends the applicability of modern machine learning potentials. This is demonstrated for a series of systems representing typical scenarios in chemistry and materials science that are described incorrectly by current methods, whereas the fourth-generation neural network potential is in excellent agreement with electronic structure calculations.
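
The charge-equilibration step referenced in the abstract can be made concrete with a small numerical sketch. The Python snippet below solves a generic second-order charge-equilibration (Qeq) problem, minimizing E(q) = sum_i (chi_i q_i + 0.5 J_i q_i^2) + 0.5 sum_{i!=j} q_i q_j / r_ij subject to the constraint that the atomic charges sum to the total charge, via one linear solve with a Lagrange multiplier. This is a minimal illustration of the technique, not the authors' implementation: the electronegativities and hardnesses here are hypothetical constants, whereas in the fourth-generation method the electronegativities are environment-dependent outputs of atomic neural networks, and the Coulomb interaction is screened with Gaussian charge densities rather than the bare 1/r kernel used below.

    import numpy as np

    def charge_equilibration(chi, hardness, positions, total_charge=0.0):
        """Solve a generic charge-equilibration (Qeq) problem.

        Minimizes E(q) = sum_i (chi_i * q_i + 0.5 * J_i * q_i**2)
                         + 0.5 * sum_{i != j} q_i * q_j / r_ij
        subject to sum_i q_i = total_charge, via a single linear
        solve with a Lagrange multiplier. Atomic units throughout.
        """
        n = len(chi)
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = np.diag(hardness)        # atomic hardness on the diagonal
        for i in range(n):
            for j in range(i + 1, n):
                r = np.linalg.norm(positions[i] - positions[j])
                A[i, j] = A[j, i] = 1.0 / r  # bare Coulomb kernel (unscreened)
        A[:n, n] = 1.0                       # charge-conservation constraint
        A[n, :n] = 1.0
        b = np.concatenate([-np.asarray(chi, dtype=float), [total_charge]])
        solution = np.linalg.solve(A, b)
        return solution[:n], solution[n]     # charges, Lagrange multiplier

    # Toy diatomic with hypothetical, asymmetric electronegativities:
    charges, lam = charge_equilibration(
        chi=np.array([0.3, -0.3]),
        hardness=np.array([1.0, 1.0]),
        positions=np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]]),
    )
    print(charges)  # -> [-0.6  0.6]; equal and opposite, summing to zero

Because the equilibrated charges depend on the full Coulomb matrix, a perturbation at one end of a system shifts the charges everywhere. This is how such a scheme can capture the non-local charge transfer and different global charge states that purely local potentials miss.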

Suggested Citation

  • Tsz Wai Ko & Jonas A. Finkler & Stefan Goedecker & Jörg Behler, 2021. "A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer," Nature Communications, Nature, vol. 12(1), pages 1-11, December.
  • Handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-020-20427-2
    DOI: 10.1038/s41467-020-20427-2

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-020-20427-2
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-020-20427-2?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a source where your library subscription gives you access to this item.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. J. Thorben Frank & Oliver T. Unke & Klaus-Robert Müller & Stefan Chmiela, 2024. "A Euclidean transformer for fast and stable machine learned force fields," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    2. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    3. Peikun Zheng & Roman Zubatyuk & Wei Wu & Olexandr Isayev & Pavlo O. Dral, 2021. "Artificial intelligence-enhanced quantum chemical method with broad applicability," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
    4. Adil Kabylda & Valentin Vassilev-Galindo & Stefan Chmiela & Igor Poltavsky & Alexandre Tkatchenko, 2023. "Efficient interatomic descriptors for accurate machine learning force fields of extended molecules," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    5. Ang Gao & Richard C. Remsing, 2022. "Self-consistent determination of long-range electrostatics in neural network potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    6. Oliver T. Unke & Stefan Chmiela & Michael Gastegger & Kristof T. Schütt & Huziel E. Sauceda & Klaus-Robert Müller, 2021. "SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects," Nature Communications, Nature, vol. 12(1), pages 1-14, December.
    7. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.
    8. Linus C. Erhard & Jochen Rohrer & Karsten Albe & Volker L. Deringer, 2024. "Modelling atomic and nanoscale structure in the silicon–oxygen system through active machine learning," Nature Communications, Nature, vol. 15(1), pages 1-12, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-020-20427-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.