Printed from https://ideas.repec.org/a/nat/natcom/v15y2024i1d10.1038_s41467-024-45566-8.html

Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting

Authors

  • David Buterez (University of Cambridge)
  • Jon Paul Janet (BioPharmaceuticals R&D, AstraZeneca)
  • Steven J. Kiddle (Data Science & AI, R&D, AstraZeneca)
  • Dino Oglic (BioPharmaceuticals R&D, AstraZeneca)
  • Pietro Lió (University of Cambridge)

Abstract

We investigate the potential of graph neural networks for transfer learning and for improving molecular property prediction on sparse, expensive-to-acquire high-fidelity data by leveraging low-fidelity measurements as an inexpensive proxy for a targeted property of interest. This problem arises in discovery processes that rely on screening funnels to trade off overall cost against throughput and accuracy. Typically, the individual stages in these processes are loosely connected, and each generates data at a different scale and fidelity. We consider this setup holistically and demonstrate empirically that existing transfer learning techniques for graph neural networks are generally unable to harness the information in multi-fidelity cascades. Here, we propose several effective transfer learning strategies and study them in transductive and inductive settings. Our analysis involves a collection of more than 28 million unique experimental protein-ligand interactions across 37 targets from drug discovery by high-throughput screening and 12 quantum properties from the QMugs dataset. The results indicate that transfer learning can improve the performance on sparse tasks by up to eight times while using an order of magnitude less high-fidelity training data. Moreover, the proposed methods consistently outperform existing transfer learning strategies for graph-structured data on both the drug discovery and quantum mechanics datasets.

Suggested Citation

  • David Buterez & Jon Paul Janet & Steven J. Kiddle & Dino Oglic & Pietro Lió, 2024. "Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
  • Handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-45566-8
    DOI: 10.1038/s41467-024-45566-8

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-024-45566-8
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-024-45566-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Adil Kabylda & Valentin Vassilev-Galindo & Stefan Chmiela & Igor Poltavsky & Alexandre Tkatchenko, 2023. "Efficient interatomic descriptors for accurate machine learning force fields of extended molecules," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    2. Tian Xie & Arthur France-Lanord & Yanming Wang & Jeffrey Lopez & Michael A. Stolberg & Megan Hill & Graham Michael Leverick & Rafael Gomez-Bombarelli & Jeremiah A. Johnson & Yang Shao-Horn & Jeffrey C, 2022. "Accelerating amorphous polymer electrolyte screening by learning to reduce errors in molecular dynamics simulated properties," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    3. Peikun Zheng & Roman Zubatyuk & Wei Wu & Olexandr Isayev & Pavlo O. Dral, 2021. "Artificial intelligence-enhanced quantum chemical method with broad applicability," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
    4. Shuai Jiang & Yi-Rong Liu & Teng Huang & Ya-Juan Feng & Chun-Yu Wang & Zhong-Quan Wang & Bin-Jing Ge & Quan-Sheng Liu & Wei-Ran Guang & Wei Huang, 2022. "Towards fully ab initio simulation of atmospheric aerosol nucleation," Nature Communications, Nature, vol. 13(1), pages 1-8, December.
    5. Niklas W. A. Gebauer & Michael Gastegger & Stefaan S. P. Hessmann & Klaus-Robert Müller & Kristof T. Schütt, 2022. "Inverse design of 3d molecular structures with conditional generative neural networks," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    6. Gaétan de Rassenfosse & Adam B. Jaffe & Joel Waldfogel, 2024. "Intellectual Property and Creative Machines," NBER Chapters, in: Entrepreneurship and Innovation Policy and the Economy, volume 4, National Bureau of Economic Research, Inc.
    7. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    8. Zeyin Yan & Dacong Wei & Xin Li & Lung Wa Chung, 2024. "Accelerating reliable multiscale quantum refinement of protein–drug systems enabled by machine learning," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    9. Yuanming Bai & Leslie Vogt-Maranto & Mark E. Tuckerman & William J. Glover, 2022. "Machine learning the Hohenberg-Kohn map for molecular excited states," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    10. Stephan Thaler & Julija Zavadlav, 2021. "Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting," Nature Communications, Nature, vol. 12(1), pages 1-10, December.
    11. Yusong Wang & Tong Wang & Shaoning Li & Xinheng He & Mingyu Li & Zun Wang & Nanning Zheng & Bin Shao & Tie-Yan Liu, 2024. "Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    12. Simon Batzner & Albert Musaelian & Lixin Sun & Mario Geiger & Jonathan P. Mailoa & Mordechai Kornbluth & Nicola Molinari & Tess E. Smidt & Boris Kozinsky, 2022. "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    13. Albert Musaelian & Simon Batzner & Anders Johansson & Lixin Sun & Cameron J. Owen & Mordechai Kornbluth & Boris Kozinsky, 2023. "Learning local equivariant representations for large-scale atomistic dynamics," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    14. Charlotte Loh & Thomas Christensen & Rumen Dangovski & Samuel Kim & Marin Soljačić, 2022. "Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    15. Pushkar G. Ghanekar & Siddharth Deshpande & Jeffrey Greeley, 2022. "Adsorbate chemical environment-based machine learning framework for heterogeneous catalysis," Nature Communications, Nature, vol. 13(1), pages 1-12, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-45566-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.