IDEAS home. Printed from https://ideas.repec.org/a/nat/natcom/v16y2025i1d10.1038_s41467-025-56345-4.html

Rapid learning with phase-change memory-based in-memory computing through learning-to-learn

Authors

Listed:
  • Thomas Ortner

    (IBM Research Europe - Zurich)

  • Horst Petschenig

    (Graz University of Technology)

  • Athanasios Vasilopoulos

    (IBM Research Europe - Zurich)

  • Roland Renner

    (Graz University of Technology)

  • Špela Brglez

    (Graz University of Technology)

  • Thomas Limbacher

    (Graz University of Technology)

  • Enrique Piñero

    (Universidad de Sevilla)

  • Alejandro Linares-Barranco

    (Universidad de Sevilla)

  • Angeliki Pantazi

    (IBM Research Europe - Zurich)

  • Robert Legenstein

    (Graz University of Technology)

Abstract

There is a growing demand for low-power, autonomously learning artificial intelligence (AI) systems that can be deployed at the edge and rapidly adapt to the specific situation at the deployment site. However, current AI models struggle in such scenarios, often requiring extensive fine-tuning, computational resources, and data. In contrast, humans can effortlessly adjust to new tasks by transferring knowledge from related ones. The concept of learning-to-learn (L2L) mimics this process and enables AI models to adapt rapidly with little data and computational effort. In-memory computing neuromorphic hardware (NMHW) is inspired by the brain’s operating principles and mimics its physical co-location of memory and compute. In this work, we pair L2L with in-memory computing NMHW based on phase-change memory devices to build efficient AI models that can rapidly adapt to new tasks. We demonstrate the versatility of our approach in two scenarios: a convolutional neural network performing image classification and a biologically inspired spiking neural network generating motor commands for a real robotic arm. Both models learn rapidly with few parameter updates. Deployed on the NMHW, they perform on par with their software equivalents. Moreover, meta-training of these models can be performed in software with high precision, alleviating the need for accurate hardware models.
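The L2L scheme the abstract describes follows the general meta-learning recipe: an outer loop optimizes a model initialization so that an inner loop of only a few gradient steps suffices to adapt it to a new task. As a rough illustration only, not the authors' actual models or hardware, the sketch below meta-trains a two-parameter linear model with a Reptile-style outer loop, perturbs the meta-learned parameters with Gaussian noise as a crude stand-in for analog programming noise, and then adapts on an unseen task with a handful of updates. All task distributions, names, and constants here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task():
    # Toy "task": fit y = a*x + b from a handful of samples, with (a, b)
    # drawn at random per task.
    a, b = rng.uniform(-2, 2, size=2)
    x = rng.uniform(-1, 1, size=10)
    return x, a * x + b

def adapt(theta, x, y, lr=0.1, steps=5):
    # Inner loop: a few gradient steps on one task's data (the rapid learning).
    w, b = theta
    for _ in range(steps):
        err = (w * x + b) - y
        w -= lr * np.mean(err * x)
        b -= lr * np.mean(err)
    return np.array([w, b])

# Outer loop (meta-training, done in high-precision software): a Reptile-style
# update nudges the initialization toward each task's adapted parameters.
theta = np.zeros(2)
for _ in range(2000):
    x, y = make_task()
    theta += 0.05 * (adapt(theta, x, y) - theta)

# "Deployment": perturb the meta-learned parameters with Gaussian noise as a
# crude stand-in for analog device noise, then adapt to an unseen task.
theta_hw = theta + rng.normal(0.0, 0.05, size=2)
x, y = make_task()
mse_before = np.mean((theta_hw[0] * x + theta_hw[1] - y) ** 2)
theta_new = adapt(theta_hw, x, y, steps=5)
mse_after = np.mean((theta_new[0] * x + theta_new[1] - y) ** 2)
```

Here the few inner-loop steps play the role of the few parameter updates at deployment, while the outer loop corresponds to meta-training in software; the paper's actual models are convolutional and spiking networks running on phase-change memory crossbars, not a scalar linear model.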

Suggested Citation

  • Thomas Ortner & Horst Petschenig & Athanasios Vasilopoulos & Roland Renner & Špela Brglez & Thomas Limbacher & Enrique Piñero & Alejandro Linares-Barranco & Angeliki Pantazi & Robert Legenstein, 2025. "Rapid learning with phase-change memory-based in-memory computing through learning-to-learn," Nature Communications, Nature, vol. 16(1), pages 1-16, December.
  • Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-56345-4
    DOI: 10.1038/s41467-025-56345-4

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-025-56345-4
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-025-56345-4?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Malte J. Rasch & Fabio Carta & Omobayode Fagbohungbe & Tayfun Gokmen, 2024. "Fast and robust analog in-memory deep neural network training," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    2. Yijun Li & Jianshi Tang & Bin Gao & Jian Yao & Anjunyi Fan & Bonan Yan & Yuchao Yang & Yue Xi & Yuankun Li & Jiaming Li & Wen Sun & Yiwei Du & Zhengwu Liu & Qingtian Zhang & Song Qiu & Qingwen Li & He, 2023. "Monolithic three-dimensional integration of RRAM-based hybrid memory architecture for one-shot learning," Nature Communications, Nature, vol. 14(1), pages 1-9, December.
    3. Thomas Dalgaty & Filippo Moro & Yiğit Demirağ & Alessio Pra & Giacomo Indiveri & Elisa Vianello & Melika Payvand, 2024. "Mosaic: in-memory computing and routing for small-world spike-based neuromorphic systems," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    4. Djohan Bonnet & Tifenn Hirtzlin & Atreya Majumdar & Thomas Dalgaty & Eduardo Esmanhotto & Valentina Meli & Niccolo Castellani & Simon Martin & Jean-François Nodin & Guillaume Bourgeois & Jean-Michel P, 2023. "Bringing uncertainty quantification to the extreme-edge with memristor-based Bayesian neural networks," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    5. Simone D’Agostino & Filippo Moro & Tristan Torchet & Yiğit Demirağ & Laurent Grenouillet & Niccolò Castellani & Giacomo Indiveri & Elisa Vianello & Melika Payvand, 2024. "DenRAM: neuromorphic dendritic architecture with RRAM for efficient temporal processing with delays," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    6. Corey Lammie & Julian Büchel & Athanasios Vasilopoulos & Manuel Gallo & Abu Sebastian, 2025. "The inherent adversarial robustness of analog in-memory computing," Nature Communications, Nature, vol. 16(1), pages 1-12, December.
    7. Malte J. Rasch & Charles Mackin & Manuel Gallo & An Chen & Andrea Fasoli & Frédéric Odermatt & Ning Li & S. R. Nandakumar & Pritish Narayanan & Hsinyu Tsai & Geoffrey W. Burr & Abu Sebastian & Vijay N, 2023. "Hardware-aware training for large-scale and diverse deep learning inference workloads using in-memory computing-based accelerators," Nature Communications, Nature, vol. 14(1), pages 1-18, December.
    8. Rohit Abraham John & Yiğit Demirağ & Yevhen Shynkarenko & Yuliia Berezovska & Natacha Ohannessian & Melika Payvand & Peng Zeng & Maryna I. Bodnarchuk & Frank Krumeich & Gökhan Kara & Ivan Shorubalko &, 2022. "Reconfigurable halide perovskite nanocrystal memristors for neuromorphic computing," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    9. Yongxiang Li & Shiqing Wang & Ke Yang & Yuchao Yang & Zhong Sun, 2024. "An emergent attractor network in a passive resistive switching circuit," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    10. Junyi Yang & Ruibin Mao & Mingrui Jiang & Yichuan Cheng & Pao-Sheng Vincent Sun & Shuai Dong & Giacomo Pedretti & Xia Sheng & Jim Ignowski & Haoliang Li & Can Li & Arindam Basu, 2025. "Efficient nonlinear function approximation in analog resistive crossbars for recurrent neural networks," Nature Communications, Nature, vol. 16(1), pages 1-15, December.
    11. Yuyan Zhu & Yang Wang & Xingchen Pang & Yongbo Jiang & Xiaoxian Liu & Qing Li & Zhen Wang & Chunsen Liu & Weida Hu & Peng Zhou, 2024. "Non-volatile 2D MoS2/black phosphorus heterojunction photodiodes in the near- to mid-infrared region," Nature Communications, Nature, vol. 15(1), pages 1-10, December.
    12. Feng, Liang & Hu, Cheng & Yu, Juan & Jiang, Haijun & Wen, Shiping, 2021. "Fixed-time Synchronization of Coupled Memristive Complex-valued Neural Networks," Chaos, Solitons & Fractals, Elsevier, vol. 148(C).
    13. Hu, Yongbing & Li, Qian & Ding, Dawei & Jiang, Li & Yang, Zongli & Zhang, Hongwei & Zhang, Zhixin, 2021. "Multiple coexisting analysis of a fractional-order coupled memristive system and its application in image encryption," Chaos, Solitons & Fractals, Elsevier, vol. 152(C).
    14. Zhang, Ge & Ma, Jun & Alsaedi, Ahmed & Ahmad, Bashir & Alzahrani, Faris, 2018. "Dynamical behavior and application in Josephson Junction coupled by memristor," Applied Mathematics and Computation, Elsevier, vol. 321(C), pages 290-299.
    15. Zhiwei Chen & Wenjie Li & Zhen Fan & Shuai Dong & Yihong Chen & Minghui Qin & Min Zeng & Xubing Lu & Guofu Zhou & Xingsen Gao & Jun-Ming Liu, 2023. "All-ferroelectric implementation of reservoir computing," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    16. Xiangpeng Liang & Yanan Zhong & Jianshi Tang & Zhengwu Liu & Peng Yao & Keyang Sun & Qingtian Zhang & Bin Gao & Hadi Heidari & He Qian & Huaqiang Wu, 2022. "Rotating neurons for all-analog implementation of cyclic reservoir computing," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    17. Matteo Saponati & Martin Vinck, 2023. "Sequence anticipation and spike-timing-dependent plasticity emerge from a predictive learning rule," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    18. Choi, Woo Sik & Jang, Jun Tae & Kim, Donguk & Yang, Tae Jun & Kim, Changwook & Kim, Hyungjin & Kim, Dae Hwan, 2022. "Influence of Al2O3 layer on InGaZnO memristor crossbar array for neuromorphic applications," Chaos, Solitons & Fractals, Elsevier, vol. 156(C).
    19. Qin, Xiaoli & Wang, Cong & Li, Lixiang & Peng, Haipeng & Yang, Yixian & Ye, Lu, 2018. "Finite-time modified projective synchronization of memristor-based neural network with multi-links and leakage delay," Chaos, Solitons & Fractals, Elsevier, vol. 116(C), pages 302-315.
    20. Ui Yeon Won & Quoc An Vu & Sung Bum Park & Mi Hyang Park & Van Dam Do & Hyun Jun Park & Heejun Yang & Young Hee Lee & Woo Jong Yu, 2023. "Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning," Nature Communications, Nature, vol. 14(1), pages 1-11, December.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.