
Thousands of conductance levels in memristors integrated on CMOS

Author

Listed:
  • Mingyi Rao

    (TetraMem
    University of Massachusetts)

  • Hao Tang

    (Massachusetts Institute of Technology)

  • Jiangbin Wu

    (University of Southern California)

  • Wenhao Song

    (University of Southern California)

  • Max Zhang

    (TetraMem)

  • Wenbo Yin

    (TetraMem)

  • Ye Zhuo

    (University of Southern California)

  • Fatemeh Kiani

    (University of Massachusetts)

  • Benjamin Chen

    (University of Massachusetts)

  • Xiangqi Jiang

    (TetraMem)

  • Hefei Liu

    (University of Southern California)

  • Hung-Yu Chen

    (University of Southern California)

  • Rivu Midya

    (University of Massachusetts)

  • Fan Ye

    (University of Massachusetts)

  • Hao Jiang

    (University of Massachusetts)

  • Zhongrui Wang

    (University of Massachusetts)

  • Mingche Wu

    (TetraMem)

  • Miao Hu

    (TetraMem)

  • Han Wang

    (University of Southern California)

  • Qiangfei Xia

    (TetraMem
    University of Massachusetts)

  • Ning Ge

    (TetraMem)

  • Ju Li

    (Massachusetts Institute of Technology)

  • J. Joshua Yang

    (TetraMem
    University of Massachusetts
    University of Southern California)

Abstract

Neural networks based on memristive devices [1–3] have the ability to improve throughput and energy efficiency for machine learning [4,5] and artificial intelligence [6], especially in edge applications [7–21]. Because training a neural network model from scratch is costly in terms of hardware resources, time and energy, it is impractical to do it individually on billions of memristive neural networks distributed at the edge. A practical approach would be to download the synaptic weights obtained from cloud training and program them directly into memristors for the commercialization of edge applications. Some post-tuning of memristor conductance could be done afterwards, or during applications, to adapt to specific situations. Therefore, in neural network applications, memristors require high-precision programmability to guarantee uniform and accurate performance across a large number of memristive networks [22–28]. This requires many distinguishable conductance levels on each memristive device, not only in laboratory-made devices but also in devices fabricated in factories. Analog memristors with many conductance states also benefit other applications, such as neural network training, scientific computing and even ‘mortal computing’ [25,29,30]. Here we report 2,048 conductance levels achieved with memristors in fully integrated chips with 256 × 256 memristor arrays monolithically integrated on complementary metal–oxide–semiconductor (CMOS) circuits in a commercial foundry. We have identified the underlying physics that previously limited the number of conductance levels achievable in memristors and developed electrical operation protocols to avoid such limitations. These results provide insights into the fundamental understanding of the microscopic picture of memristive switching, as well as approaches to enable high-precision memristors for various applications.
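The workflow described in the abstract, downloading cloud-trained synaptic weights and programming them into discrete memristor conductance levels, can be illustrated with a short sketch. The Python snippet below is a minimal illustration only, not the authors' method: the 2,048-level count comes from the paper, but the conductance window (G_MIN, G_MAX), the linear weight-to-conductance mapping and all function names are assumptions made for this example.

    import numpy as np

    # Illustrative parameters. The paper reports 2,048 distinguishable levels;
    # the conductance window below is an assumption, not a value from the paper.
    N_LEVELS = 2048
    G_MIN, G_MAX = 50e-6, 4e-3  # assumed programmable window, in siemens

    def quantize_weights_to_levels(weights):
        """Map cloud-trained weights onto discrete memristor conductance targets."""
        w = np.asarray(weights, dtype=float)
        # Rescale weights to [0, 1] over their observed range.
        w_norm = (w - w.min()) / (w.max() - w.min() + 1e-12)
        # Round to the nearest of N_LEVELS equally spaced level indices.
        level_idx = np.rint(w_norm * (N_LEVELS - 1)).astype(int)
        # Convert level indices to target conductances for programming.
        g_target = G_MIN + level_idx * (G_MAX - G_MIN) / (N_LEVELS - 1)
        return level_idx, g_target

    # Example: quantize a small weight matrix as if downloaded from cloud training.
    rng = np.random.default_rng(0)
    levels, conductances = quantize_weights_to_levels(rng.normal(size=(4, 4)))
    print(levels.min(), levels.max())              # level indices in [0, 2047]
    print(conductances.min(), conductances.max())  # target conductances in siemens

With 2,048 levels (11 bits per device), the quantization step over the assumed window is (G_MAX - G_MIN)/2,047, roughly 1.9 µS, which is why the paper's emphasis on distinguishable, precisely programmable levels matters for transferring trained weights to edge hardware without retraining.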

Suggested Citation

  • Mingyi Rao & Hao Tang & Jiangbin Wu & Wenhao Song & Max Zhang & Wenbo Yin & Ye Zhuo & Fatemeh Kiani & Benjamin Chen & Xiangqi Jiang & Hefei Liu & Hung-Yu Chen & Rivu Midya & Fan Ye & Hao Jiang & Zhong, 2023. "Thousands of conductance levels in memristors integrated on CMOS," Nature, Nature, vol. 615(7954), pages 823-829, March.
  • Handle: RePEc:nat:nature:v:615:y:2023:i:7954:d:10.1038_s41586-023-05759-5
    DOI: 10.1038/s41586-023-05759-5

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41586-023-05759-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/s41586-023-05759-5?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Peng Chen & Fenghao Liu & Peng Lin & Peihong Li & Yu Xiao & Bihua Zhang & Gang Pan, 2023. "Open-loop analog programmable electrochemical memory array," Nature Communications, Nature, vol. 14(1), pages 1-9, December.
    2. Yulin Feng & Yizhou Zhang & Zheng Zhou & Peng Huang & Lifeng Liu & Xiaoyan Liu & Jinfeng Kang, 2024. "Memristor-based storage system with convolutional autoencoder-based image compression network," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    3. Jongmin Lee & Bum Ho Jeong & Eswaran Kamaraj & Dohyung Kim & Hakjun Kim & Sanghyuk Park & Hui Joon Park, 2023. "Light-enhanced molecular polarity enabling multispectral color-cognitive memristor for neuromorphic visual system," Nature Communications, Nature, vol. 14(1), pages 1-19, December.
    4. Jaeseoung Park & Ashwani Kumar & Yucheng Zhou & Sangheon Oh & Jeong-Hoon Kim & Yuhan Shi & Soumil Jain & Gopabandhu Hota & Erbin Qiu & Amelie L. Nagle & Ivan K. Schuller & Catherine D. Schuman & Gert , 2024. "Multi-level, forming and filament free, bulk switching trilayer RRAM for neuromorphic computing at the edge," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    5. Christoph Stöckl & Yukun Yang & Wolfgang Maass, 2024. "Local prediction-learning in high-dimensional spaces enables neural networks to plan," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    6. Mingrui Jiang & Keyi Shan & Chengping He & Can Li, 2023. "Efficient combinatorial optimization by quantum-inspired parallel annealing in analogue memristor crossbar," Nature Communications, Nature, vol. 14(1), pages 1-11, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:nature:v:615:y:2023:i:7954:d:10.1038_s41586-023-05759-5. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.