Printed from https://ideas.repec.org/a/nat/natcom/v16y2025i1d10.1038_s41467-025-56595-2.html

The inherent adversarial robustness of analog in-memory computing

Authors

Listed:
  • Corey Lammie (IBM Research Europe)
  • Julian Büchel (IBM Research Europe)
  • Athanasios Vasilopoulos (IBM Research Europe)
  • Manuel Le Gallo (IBM Research Europe)
  • Abu Sebastian (IBM Research Europe)

Abstract

A key challenge for deep neural network algorithms is their vulnerability to adversarial attacks. Inherently non-deterministic compute substrates, such as those based on analog in-memory computing, have been speculated to provide significant adversarial robustness when performing deep neural network inference. In this paper, we experimentally validate this conjecture for the first time on an analog in-memory computing chip based on phase-change memory devices. We demonstrate higher adversarial robustness against different types of adversarial attacks when implementing an image classification network. Additional robustness is also observed when performing hardware-in-the-loop attacks, for which the attacker is assumed to have full access to the hardware. A careful study of the various noise sources indicates that a combination of stochastic noise sources (both recurrent and non-recurrent) is responsible for the adversarial robustness, and that their type and magnitude disproportionately affect this property. Finally, it is demonstrated, via simulations, that when a much larger transformer network is used to implement a natural language processing task, additional robustness is still observed.
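The mechanism described in the abstract — fresh stochastic noise in the analog weights at every inference pass, which an adversarial perturbation crafted against the noise-free model cannot fully anticipate — can be illustrated with a toy simulation. This is a hedged sketch, not the paper's hardware, networks, or attack suite: the linear softmax classifier, the Gaussian multiplicative weight noise, and all magnitudes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_mvm(W, x, noise_std=0.3, rng=rng):
    """Matrix-vector multiply with multiplicative weight noise, a toy
    stand-in for per-inference analog device fluctuations (assumed
    Gaussian; the magnitude is illustrative only)."""
    W_noisy = W * (1.0 + noise_std * rng.standard_normal(W.shape))
    return W_noisy @ x

def fgsm_perturb(W, x, y, eps):
    """FGSM on a linear softmax classifier: the input gradient of the
    cross-entropy loss is W^T (softmax(Wx) - one_hot(y))."""
    logits = W @ x
    p = np.exp(logits - logits.max())
    p /= p.sum()
    p[y] -= 1.0                      # softmax(Wx) - one_hot(y)
    grad = W.T @ p
    return x + eps * np.sign(grad)

# Toy 2-class problem: the noise-free model classifies x correctly,
# and the FGSM perturbation flips its prediction.
W = np.array([[2.0, -1.0], [-1.0, 2.0]])
x = np.array([1.0, 0.0])
x_adv = fgsm_perturb(W, x, y=0, eps=0.55)

clean_pred = int(np.argmax(W @ x))       # correct class (0)
adv_pred = int(np.argmax(W @ x_adv))     # attack succeeds noise-free (1)

# Each noisy forward pass sees fresh device noise, so the adversarial
# example no longer fools the model deterministically.
noisy_preds = [int(np.argmax(noisy_mvm(W, x_adv))) for _ in range(200)]
recovered = noisy_preds.count(clean_pred) / len(noisy_preds)
print(f"fraction of noisy inferences recovering the true class: {recovered:.2f}")
```

Because the attack margin here is small relative to the noise, a nonzero fraction of noisy inferences recovers the true class, whereas the noise-free model is fooled every time; the paper's point is that real phase-change-memory noise sources produce a related effect at chip scale.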

Suggested Citation

  • Corey Lammie & Julian Büchel & Athanasios Vasilopoulos & Manuel Gallo & Abu Sebastian, 2025. "The inherent adversarial robustness of analog in-memory computing," Nature Communications, Nature, vol. 16(1), pages 1-12, December.
  • Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-56595-2
    DOI: 10.1038/s41467-025-56595-2

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-025-56595-2
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-025-56595-2?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Vinay Joshi & Manuel Le Gallo & Simon Haefeli & Irem Boybat & S. R. Nandakumar & Christophe Piveteau & Martino Dazzi & Bipin Rajendran & Abu Sebastian & Evangelos Eleftheriou, 2020. "Accurate deep neural network inference using computational phase-change memory," Nature Communications, Nature, vol. 11(1), pages 1-13, December.
    2. Malte J. Rasch & Charles Mackin & Manuel Gallo & An Chen & Andrea Fasoli & Frédéric Odermatt & Ning Li & S. R. Nandakumar & Pritish Narayanan & Hsinyu Tsai & Geoffrey W. Burr & Abu Sebastian & Vijay N, 2023. "Hardware-aware training for large-scale and diverse deep learning inference workloads using in-memory computing-based accelerators," Nature Communications, Nature, vol. 14(1), pages 1-18, December.

    Most related items

These are the items that most often cite the same works as this one, and that are most often cited by the same works as this one.
    1. Thomas Ortner & Horst Petschenig & Athanasios Vasilopoulos & Roland Renner & Špela Brglez & Thomas Limbacher & Enrique Piñero & Alejandro Linares-Barranco & Angeliki Pantazi & Robert Legenstein, 2025. "Rapid learning with phase-change memory-based in-memory computing through learning-to-learn," Nature Communications, Nature, vol. 16(1), pages 1-16, December.
    2. Djohan Bonnet & Tifenn Hirtzlin & Atreya Majumdar & Thomas Dalgaty & Eduardo Esmanhotto & Valentina Meli & Niccolo Castellani & Simon Martin & Jean-François Nodin & Guillaume Bourgeois & Jean-Michel P, 2023. "Bringing uncertainty quantification to the extreme-edge with memristor-based Bayesian neural networks," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    3. Xiangpeng Liang & Yanan Zhong & Jianshi Tang & Zhengwu Liu & Peng Yao & Keyang Sun & Qingtian Zhang & Bin Gao & Hadi Heidari & He Qian & Huaqiang Wu, 2022. "Rotating neurons for all-analog implementation of cyclic reservoir computing," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    4. Choi, Woo Sik & Jang, Jun Tae & Kim, Donguk & Yang, Tae Jun & Kim, Changwook & Kim, Hyungjin & Kim, Dae Hwan, 2022. "Influence of Al2O3 layer on InGaZnO memristor crossbar array for neuromorphic applications," Chaos, Solitons & Fractals, Elsevier, vol. 156(C).
    5. Malte J. Rasch & Fabio Carta & Omobayode Fagbohungbe & Tayfun Gokmen, 2024. "Fast and robust analog in-memory deep neural network training," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    6. Charles Mackin & Malte J. Rasch & An Chen & Jonathan Timcheck & Robert L. Bruce & Ning Li & Pritish Narayanan & Stefano Ambrogio & Manuel Gallo & S. R. Nandakumar & Andrea Fasoli & Jose Luquin & Alexa, 2022. "Optimised weight programming for analogue memory-based deep neural networks," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    7. Shi-Yuan Ma & Tianyu Wang & Jérémie Laydevant & Logan G. Wright & Peter L. McMahon, 2025. "Quantum-limited stochastic optical neural networks operating at a few quanta per activation," Nature Communications, Nature, vol. 16(1), pages 1-12, December.
    8. Malte J. Rasch & Charles Mackin & Manuel Gallo & An Chen & Andrea Fasoli & Frédéric Odermatt & Ning Li & S. R. Nandakumar & Pritish Narayanan & Hsinyu Tsai & Geoffrey W. Burr & Abu Sebastian & Vijay N, 2023. "Hardware-aware training for large-scale and diverse deep learning inference workloads using in-memory computing-based accelerators," Nature Communications, Nature, vol. 14(1), pages 1-18, December.
    9. Ik-Jyae Kim & Min-Kyu Kim & Jang-Sik Lee, 2023. "Highly-scaled and fully-integrated 3-dimensional ferroelectric transistor array for hardware implementation of neural networks," Nature Communications, Nature, vol. 14(1), pages 1-10, December.
    10. Thomas Dalgaty & Filippo Moro & Yiğit Demirağ & Alessio Pra & Giacomo Indiveri & Elisa Vianello & Melika Payvand, 2024. "Mosaic: in-memory computing and routing for small-world spike-based neuromorphic systems," Nature Communications, Nature, vol. 15(1), pages 1-12, December.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.