
A knowledge-guided pre-training framework for improving molecular representation learning

Author

Listed:
  • Han Li (Tsinghua University)
  • Ruotian Zhang (Tsinghua University)
  • Yaosen Min (Tsinghua University)
  • Dacheng Ma (Zhejiang Laboratory)
  • Dan Zhao (Tsinghua University)
  • Jianyang Zeng (Tsinghua University; Westlake University, Zhejiang Province)

Abstract

Learning effective molecular feature representation to facilitate molecular property prediction is of great significance for drug discovery. Recently, there has been a surge of interest in pre-training graph neural networks (GNNs) via self-supervised learning techniques to overcome the challenge of data scarcity in molecular property prediction. However, current self-supervised learning-based methods suffer from two main obstacles: the lack of a well-defined self-supervised learning strategy and the limited capacity of GNNs. Here, we propose Knowledge-guided Pre-training of Graph Transformer (KPGT), a self-supervised learning framework to alleviate the aforementioned issues and provide generalizable and robust molecular representations. The KPGT framework integrates a graph transformer specifically designed for molecular graphs and a knowledge-guided pre-training strategy, to fully capture both structural and semantic knowledge of molecules. Through extensive computational tests on 63 datasets, KPGT exhibits superior performance in predicting molecular properties across various domains. Moreover, the practical applicability of KPGT in drug discovery has been validated by identifying potential inhibitors of two antitumor targets: hematopoietic progenitor kinase 1 (HPK1) and fibroblast growth factor receptor 1 (FGFR1). Overall, KPGT can provide a powerful and useful tool for advancing the artificial intelligence (AI)-aided drug discovery process.
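
To make the pre-training idea concrete, below is a minimal, illustrative sketch in PyTorch, not the authors' implementation: a generic transformer encoder over node features is trained both to recover masked node features and to predict molecule-level knowledge targets (here, placeholder descriptor values and fingerprint bits). All class names, dimensions, the ~15% masking ratio, the mean-pooling readout, the equal loss weighting, and the random tensors standing in for featurized molecules are assumptions made for illustration only; KPGT's actual graph transformer and knowledge-guided strategy are described in the paper.

```python
# Hypothetical sketch of knowledge-guided self-supervised pre-training (not KPGT's code).
import torch
import torch.nn as nn

class GraphTransformerEncoder(nn.Module):
    """Toy stand-in for a graph transformer: a plain transformer over node features."""
    def __init__(self, node_dim=64, hidden_dim=128, num_layers=2, num_heads=4):
        super().__init__()
        self.embed = nn.Linear(node_dim, hidden_dim)
        layer = nn.TransformerEncoderLayer(hidden_dim, num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, node_feats):                      # (batch, num_nodes, node_dim)
        return self.encoder(self.embed(node_feats))     # (batch, num_nodes, hidden_dim)

class KnowledgeGuidedPretrainer(nn.Module):
    """Predicts (i) masked node features and (ii) molecule-level knowledge labels."""
    def __init__(self, node_dim=64, hidden_dim=128, num_descriptors=200, fp_bits=512):
        super().__init__()
        self.encoder = GraphTransformerEncoder(node_dim, hidden_dim)
        self.node_head = nn.Linear(hidden_dim, node_dim)         # masked-feature recovery
        self.desc_head = nn.Linear(hidden_dim, num_descriptors)  # descriptor regression
        self.fp_head = nn.Linear(hidden_dim, fp_bits)            # fingerprint-bit prediction

    def forward(self, node_feats, mask):
        h = self.encoder(node_feats)
        graph_repr = h.mean(dim=1)                      # simple mean readout over nodes
        return {
            "masked_nodes": self.node_head(h[mask]),    # only at masked positions
            "descriptors": self.desc_head(graph_repr),
            "fingerprint": self.fp_head(graph_repr),
        }

# One pre-training step on random tensors standing in for featurized molecules.
model = KnowledgeGuidedPretrainer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

node_feats = torch.randn(8, 32, 64)                    # 8 molecules, 32 nodes each
mask = torch.rand(8, 32) < 0.15                        # ~15% of nodes masked
targets_nodes = node_feats[mask]                       # original features to recover
targets_desc = torch.randn(8, 200)                     # placeholder descriptor values
targets_fp = torch.randint(0, 2, (8, 512)).float()     # placeholder fingerprint bits

corrupted = node_feats.clone()
corrupted[mask] = 0.0                                  # zero out masked node features

out = model(corrupted, mask)
loss = (
    nn.functional.mse_loss(out["masked_nodes"], targets_nodes)
    + nn.functional.mse_loss(out["descriptors"], targets_desc)
    + nn.functional.binary_cross_entropy_with_logits(out["fingerprint"], targets_fp)
)
opt.zero_grad()
loss.backward()
opt.step()
```

In a real pipeline, the knowledge targets would typically be molecular descriptors and fingerprints computed once per molecule with a cheminformatics toolkit rather than sampled at random, and the reconstruction and knowledge losses could be weighted differently.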

Suggested Citation

  • Han Li & Ruotian Zhang & Yaosen Min & Dacheng Ma & Dan Zhao & Jianyang Zeng, 2023. "A knowledge-guided pre-training framework for improving molecular representation learning," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
  • Handle: RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-43214-1
    DOI: 10.1038/s41467-023-43214-1

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-023-43214-1
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-023-43214-1?utm_source=ideas
    LibKey link: if access is restricted and your library subscribes to this service, LibKey redirects you to the full text via your library subscription

    References listed on IDEAS

    1. Tobias Klein & Navratna Vajpai & Jonathan J. Phillips & Gareth Davies & Geoffrey A. Holdgate & Chris Phillips & Julie A. Tucker & Richard A. Norman & Andrew D. Scott & Daniel R. Higazi & David Lowe & , 2015. "Structural and dynamic insights into the energetics of activation loop rearrangement in FGFR1 kinase," Nature Communications, Nature, vol. 6(1), pages 1-12, November.
    2. Alexandre Tkatchenko, 2020. "Machine learning for chemical discovery," Nature Communications, Nature, vol. 11(1), pages 1-4, December.
    3. Keith T. Butler & Daniel W. Davies & Hugh Cartwright & Olexandr Isayev & Aron Walsh, 2018. "Machine learning for molecular and materials science," Nature, Nature, vol. 559(7715), pages 547-555, July.

    Citations

    Cited by:

    1. Xiaochu Tong & Ning Qu & Xiangtai Kong & Shengkun Ni & Jingyi Zhou & Kun Wang & Lehan Zhang & Yiming Wen & Jiangshan Shi & Sulin Zhang & Xutong Li & Mingyue Zheng, 2024. "Deep representation learning of chemical-induced transcriptional profile for phenotype-based drug discovery," Nature Communications, Nature, vol. 15(1), pages 1-14, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Huziel E. Sauceda & Luis E. Gálvez-González & Stefan Chmiela & Lauro Oliver Paz-Borbón & Klaus-Robert Müller & Alexandre Tkatchenko, 2022. "BIGDML—Towards accurate quantum machine learning force fields for materials," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    2. Tian Xie & Arthur France-Lanord & Yanming Wang & Jeffrey Lopez & Michael A. Stolberg & Megan Hill & Graham Michael Leverick & Rafael Gomez-Bombarelli & Jeremiah A. Johnson & Yang Shao-Horn & Jeffrey C, 2022. "Accelerating amorphous polymer electrolyte screening by learning to reduce errors in molecular dynamics simulated properties," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    3. Li, Yi & Liu, Kailong & Foley, Aoife M. & Zülke, Alana & Berecibar, Maitane & Nanini-Maury, Elise & Van Mierlo, Joeri & Hoster, Harry E., 2019. "Data-driven health estimation and lifetime prediction of lithium-ion batteries: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 113(C), pages 1-1.
    4. O. V. Mythreyi & M. Rohith Srinivaas & Tigga Amit Kumar & R. Jayaganthan, 2021. "Machine-Learning-Based Prediction of Corrosion Behavior in Additively Manufactured Inconel 718," Data, MDPI, vol. 6(8), pages 1-16, July.
    5. Sarmad Dashti Latif & Ali Najah Ahmed, 2023. "A review of deep learning and machine learning techniques for hydrological inflow forecasting," Environment, Development and Sustainability: A Multidisciplinary Approach to the Theory and Practice of Sustainable Development, Springer, vol. 25(11), pages 12189-12216, November.
    6. Snehi Shrestha & Kieran James Barvenik & Tianle Chen & Haochen Yang & Yang Li & Meera Muthachi Kesavan & Joshua M. Little & Hayden C. Whitley & Zi Teng & Yaguang Luo & Eleonora Tubaldi & Po-Yen Chen, 2024. "Machine intelligence accelerated design of conductive MXene aerogels with programmable properties," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    7. Oscar Méndez-Lucio & Christos A. Nicolaou & Berton Earnshaw, 2024. "MolE: a foundation model for molecular graphs using disentangled attention," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    8. Xinyu Chen & Shuaihua Lu & Qian Chen & Qionghua Zhou & Jinlan Wang, 2024. "From bulk effective mass to 2D carrier mobility accurate prediction via adversarial transfer learning," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    9. Zhiyuan Han & An Chen & Zejian Li & Mengtian Zhang & Zhilong Wang & Lixue Yang & Runhua Gao & Yeyang Jia & Guanjun Ji & Zhoujie Lao & Xiao Xiao & Kehao Tao & Jing Gao & Wei Lv & Tianshuai Wang & Jinji, 2024. "Machine learning-based design of electrocatalytic materials towards high-energy lithium||sulfur batteries development," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    10. Niklas W. A. Gebauer & Michael Gastegger & Stefaan S. P. Hessmann & Klaus-Robert Müller & Kristof T. Schütt, 2022. "Inverse design of 3d molecular structures with conditional generative neural networks," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    11. Gabriele Orlando & Daniele Raimondi & Ramon Duran-Romaña & Yves Moreau & Joost Schymkowitz & Frederic Rousseau, 2022. "PyUUL provides an interface between biological structures and deep learning algorithms," Nature Communications, Nature, vol. 13(1), pages 1-9, December.
    12. Gang Wang & Shinya Mine & Duotian Chen & Yuan Jing & Kah Wei Ting & Taichi Yamaguchi & Motoshi Takao & Zen Maeno & Ichigaku Takigawa & Koichi Matsushita & Ken-ichi Shimizu & Takashi Toyao, 2023. "Accelerated discovery of multi-elemental reverse water-gas shift catalysts using extrapolative machine learning approach," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    13. Sukriti Manna & Troy D. Loeffler & Rohit Batra & Suvo Banik & Henry Chan & Bilvin Varughese & Kiran Sasikumar & Michael Sternberg & Tom Peterka & Mathew J. Cherukara & Stephen K. Gray & Bobby G. Sumpt, 2022. "Learning in continuous action space for developing high dimensional potential energy models," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    14. Cong, Jian & Ma, Tianzeng & Chang, Zheshao & Zhang, Qiangqiang & Akhatov, Jasurjon S. & Fu, Mingkai & Li, Xin, 2023. "Neural network and experimental thermodynamics study of YCrO3-δ for efficient solar thermochemical hydrogen production," Renewable Energy, Elsevier, vol. 213(C), pages 1-10.
    15. Ribeiro, Haroldo V. & Lopes, Diego D. & Pessa, Arthur A.B. & Martins, Alvaro F. & da Cunha, Bruno R. & Gonçalves, Sebastián & Lenzi, Ervin K. & Hanley, Quentin S. & Perc, Matjaž, 2023. "Deep learning criminal networks," Chaos, Solitons & Fractals, Elsevier, vol. 172(C).
    16. Xuan-Kun Li & Jian-Xu Ma & Xiang-Yu Li & Jun-Jie Hu & Chuan-Yang Ding & Feng-Kai Han & Xiao-Min Guo & Xi Tan & Xian-Min Jin, 2024. "High-efficiency reinforcement learning with hybrid architecture photonic integrated circuit," Nature Communications, Nature, vol. 15(1), pages 1-10, December.
    17. Hao Xu & Jinglong Lin & Dongxiao Zhang & Fanyang Mo, 2023. "Retention time prediction for chromatographic enantioseparation by quantile geometry-enhanced graph neural network," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    18. Zeyin Yan & Dacong Wei & Xin Li & Lung Wa Chung, 2024. "Accelerating reliable multiscale quantum refinement of protein–drug systems enabled by machine learning," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    19. Xinyu Chen & Yufeng Xie & Yaochen Sheng & Hongwei Tang & Zeming Wang & Yu Wang & Yin Wang & Fuyou Liao & Jingyi Ma & Xiaojiao Guo & Ling Tong & Hanqi Liu & Hao Liu & Tianxiang Wu & Jiaxin Cao & Sitong, 2021. "Wafer-scale functional circuits based on two dimensional semiconductors with fabrication optimized by machine learning," Nature Communications, Nature, vol. 12(1), pages 1-8, December.
    20. Kangming Li & Daniel Persaud & Kamal Choudhary & Brian DeCost & Michael Greenwood & Jason Hattrick-Simpers, 2023. "Exploiting redundancy in large materials datasets for efficient machine learning with less data," Nature Communications, Nature, vol. 14(1), pages 1-10, December.


