Printed from https://ideas.repec.org/a/nat/natcom/v13y2022i1d10.1038_s41467-022-34436-w.html

Machine learning the Hohenberg-Kohn map for molecular excited states

Authors

Listed:
  • Yuanming Bai

    (NYU Shanghai
    NYU-ECNU Center for Computational Chemistry at NYU Shanghai
    New York University)

  • Leslie Vogt-Maranto

    (New York University)

  • Mark E. Tuckerman

    (NYU-ECNU Center for Computational Chemistry at NYU Shanghai
    New York University
    Simons Center for Computational Physical Chemistry at New York University
    Courant Institute of Mathematical Sciences, New York University)

  • William J. Glover

    (NYU Shanghai
    NYU-ECNU Center for Computational Chemistry at NYU Shanghai
    New York University)

Abstract

The Hohenberg-Kohn theorem of density-functional theory establishes the existence of a bijection between the ground-state electron density and the external potential of a many-body system. This guarantees a one-to-one map from the electron density to all observables of interest including electronic excited-state energies. Time-Dependent Density-Functional Theory (TDDFT) provides one framework to resolve this map; however, the approximations inherent in practical TDDFT calculations, together with their computational expense, motivate finding a cheaper, more direct map for electronic excitations. Here, we show that determining density and energy functionals via machine learning allows the equations of TDDFT to be bypassed. The framework we introduce is used to perform the first excited-state molecular dynamics simulations with a machine-learned functional on malonaldehyde and correctly capture the kinetics of its excited-state intramolecular proton transfer, allowing insight into how mechanical constraints can be used to control the proton transfer reaction in this molecule. This development opens the door to using machine-learned functionals for highly efficient excited-state dynamics simulations.

Suggested Citation

  • Yuanming Bai & Leslie Vogt-Maranto & Mark E. Tuckerman & William J. Glover, 2022. "Machine learning the Hohenberg-Kohn map for molecular excited states," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
  • Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-34436-w
    DOI: 10.1038/s41467-022-34436-w

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-022-34436-w
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-022-34436-w?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kristof T. Schütt & Farhad Arbabzadah & Stefan Chmiela & Klaus R. Müller & Alexandre Tkatchenko, 2017. "Quantum-chemical insights from deep tensor neural networks," Nature Communications, Nature, vol. 8(1), pages 1-8, April.
    2. Mihail Bogojeski & Leslie Vogt-Maranto & Mark E. Tuckerman & Klaus-Robert Müller & Kieron Burke, 2020. "Quantum chemical accuracy from density functional approximations via machine learning," Nature Communications, Nature, vol. 11(1), pages 1-11, December.
    3. K. T. Schütt & M. Gastegger & A. Tkatchenko & K.-R. Müller & R. J. Maurer, 2019. "Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions," Nature Communications, Nature, vol. 10(1), pages 1-10, December.
    4. Sebastian Dick & Marivi Fernandez-Serra, 2020. "Machine learning accurate exchange and correlation functionals of the electronic density," Nature Communications, Nature, vol. 11(1), pages 1-10, December.
    5. Felix Brockherde & Leslie Vogt & Li Li & Mark E. Tuckerman & Kieron Burke & Klaus-Robert Müller, 2017. "Bypassing the Kohn-Sham equations with machine learning," Nature Communications, Nature, vol. 8(1), pages 1-10, December.
    6. Stefan Chmiela & Huziel E. Sauceda & Klaus-Robert Müller & Alexandre Tkatchenko, 2018. "Towards exact molecular dynamics simulations with machine-learned force fields," Nature Communications, Nature, vol. 9(1), pages 1-10, December.
    7. Md. Wahadoszamen & Iris Margalit & Anjue Mane Ara & Rienk van Grondelle & Dror Noy, 2014. "The role of charge-transfer states in energy transfer and dissipation within natural and artificial bacteriochlorophyll proteins," Nature Communications, Nature, vol. 5(1), pages 1-8, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. J. Thorben Frank & Oliver T. Unke & Klaus-Robert Müller & Stefan Chmiela, 2024. "A Euclidean transformer for fast and stable machine learned force fields," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    2. Yusong Wang & Tong Wang & Shaoning Li & Xinheng He & Mingyu Li & Zun Wang & Nanning Zheng & Bin Shao & Tie-Yan Liu, 2024. "Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    3. Simon Batzner & Albert Musaelian & Lixin Sun & Mario Geiger & Jonathan P. Mailoa & Mordechai Kornbluth & Nicola Molinari & Tess E. Smidt & Boris Kozinsky, 2022. "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    4. Albert Musaelian & Simon Batzner & Anders Johansson & Lixin Sun & Cameron J. Owen & Mordechai Kornbluth & Boris Kozinsky, 2023. "Learning local equivariant representations for large-scale atomistic dynamics," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    5. Oliver T. Unke & Stefan Chmiela & Michael Gastegger & Kristof T. Schütt & Huziel E. Sauceda & Klaus-Robert Müller, 2021. "SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects," Nature Communications, Nature, vol. 12(1), pages 1-14, December.
    6. Charlotte Loh & Thomas Christensen & Rumen Dangovski & Samuel Kim & Marin Soljačić, 2022. "Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    7. Laura Lewis & Hsin-Yuan Huang & Viet T. Tran & Sebastian Lehner & Richard Kueng & John Preskill, 2024. "Improved machine learning algorithm for predicting ground state properties," Nature Communications, Nature, vol. 15(1), pages 1-8, December.
    8. Zhou, Yuekuan, 2024. "AI-driven battery ageing prediction with distributed renewable community and E-mobility energy sharing," Renewable Energy, Elsevier, vol. 225(C).
    9. Niklas W. A. Gebauer & Michael Gastegger & Stefaan S. P. Hessmann & Klaus-Robert Müller & Kristof T. Schütt, 2022. "Inverse design of 3d molecular structures with conditional generative neural networks," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    10. Nikita Moshkov & Tim Becker & Kevin Yang & Peter Horvath & Vlado Dancik & Bridget K. Wagner & Paul A. Clemons & Shantanu Singh & Anne E. Carpenter & Juan C. Caicedo, 2023. "Predicting compound activity from phenotypic profiles and chemical structures," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    11. Amin Alibakhshi & Bernd Hartke, 2022. "Implicitly perturbed Hamiltonian as a class of versatile and general-purpose molecular representations for machine learning," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    12. Eric C.-Y. Yuan & Anup Kumar & Xingyi Guan & Eric D. Hermes & Andrew S. Rosen & Judit Zádor & Teresa Head-Gordon & Samuel M. Blau, 2024. "Analytical ab initio hessian from a deep learning potential for transition state optimization," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    13. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    14. Xing Chen & Flavio Abreu Araujo & Mathieu Riou & Jacob Torrejon & Dafiné Ravelosona & Wang Kang & Weisheng Zhao & Julie Grollier & Damien Querlioz, 2022. "Forecasting the outcome of spintronic experiments with Neural Ordinary Differential Equations," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    15. David Buterez & Jon Paul Janet & Steven J. Kiddle & Dino Oglic & Pietro Lió, 2024. "Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    16. Liu, Yuanbin & Hong, Weixiang & Cao, Bingyang, 2019. "Machine learning for predicting thermodynamic properties of pure fluids and their mixtures," Energy, Elsevier, vol. 188(C).
    17. Adil Kabylda & Valentin Vassilev-Galindo & Stefan Chmiela & Igor Poltavsky & Alexandre Tkatchenko, 2023. "Efficient interatomic descriptors for accurate machine learning force fields of extended molecules," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    18. Xiao Tan & Yuan Zhou & Zuohua Ding & Yang Liu, 2021. "Selecting Correct Methods to Extract Fuzzy Rules from Artificial Neural Network," Mathematics, MDPI, vol. 9(11), pages 1-22, May.
    19. Alessio Fallani & Leonardo Medrano Sandonas & Alexandre Tkatchenko, 2024. "Inverse mapping of quantum properties to structures for chemical space of small organic molecules," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    20. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-34436-w. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.