Printed from https://ideas.repec.org/a/nat/nature/v620y2023i7976d10.1038_s41586-023-06377-x.html

A high-performance speech neuroprosthesis

Authors
  • Francis R. Willett

    (Howard Hughes Medical Institute at Stanford University)

  • Erin M. Kunz

(Stanford University)

  • Chaofei Fan

    (Stanford University)

  • Donald T. Avansino

    (Howard Hughes Medical Institute at Stanford University)

  • Guy H. Wilson

    (Stanford University)

  • Eun Young Choi

    (Stanford University)

  • Foram Kamdar

    (Stanford University)

  • Matthew F. Glasser

(Washington University in St. Louis)

  • Leigh R. Hochberg

    (Providence VA Medical Center
    Brown University
    Harvard Medical School)

  • Shaul Druckmann

    (Stanford University)

  • Krishna V. Shenoy

(Howard Hughes Medical Institute at Stanford University
    Stanford University)

  • Jaimie M. Henderson

(Stanford University)

Abstract

Speech brain–computer interfaces (BCIs) have the potential to restore rapid communication to people with paralysis by decoding neural activity evoked by attempted speech into text1,2 or sound3,4. Early demonstrations, although promising, have not yet achieved accuracies sufficiently high for communication of unconstrained sentences from a large vocabulary1–7. Here we demonstrate a speech-to-text BCI that records spiking activity from intracortical microelectrode arrays. Enabled by these high-resolution recordings, our study participant—who can no longer speak intelligibly owing to amyotrophic lateral sclerosis—achieved a 9.1% word error rate on a 50-word vocabulary (2.7 times fewer errors than the previous state-of-the-art speech BCI2) and a 23.8% word error rate on a 125,000-word vocabulary (the first successful demonstration, to our knowledge, of large-vocabulary decoding). Our participant’s attempted speech was decoded at 62 words per minute, which is 3.4 times as fast as the previous record8 and begins to approach the speed of natural conversation (160 words per minute9). Finally, we highlight two aspects of the neural code for speech that are encouraging for speech BCIs: spatially intermixed tuning to speech articulators that makes accurate decoding possible from only a small region of cortex, and a detailed articulatory representation of phonemes that persists years after paralysis. These results show a feasible path forward for restoring rapid communication to people with paralysis who can no longer speak.
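The word error rates quoted in the abstract (9.1% and 23.8%) follow the standard edit-distance definition used in speech recognition: the minimum number of word substitutions, insertions, and deletions needed to turn the decoded sentence into the reference sentence, divided by the reference length. As an illustrative sketch (not the authors' code), the metric can be computed like this:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Compute WER as word-level edit distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,      # deletion
                d[i][j - 1] + 1,      # insertion
                d[i - 1][j - 1] + cost,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / len(ref)
```

A 9.1% WER on a 50-word vocabulary means roughly one word in eleven is decoded incorrectly; the 2.7-fold improvement claim compares this against the prior state of the art on the same vocabulary size.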

Suggested Citation

  • Francis R. Willett & Erin M. Kunz & Chaofei Fan & Donald T. Avansino & Guy H. Wilson & Eun Young Choi & Foram Kamdar & Matthew F. Glasser & Leigh R. Hochberg & Shaul Druckmann & Krishna V. Shenoy & Jaimie M. Henderson, 2023. "A high-performance speech neuroprosthesis," Nature, Nature, vol. 620(7976), pages 1031-1036, August.
  • Handle: RePEc:nat:nature:v:620:y:2023:i:7976:d:10.1038_s41586-023-06377-x
    DOI: 10.1038/s41586-023-06377-x

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41586-023-06377-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/s41586-023-06377-x?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Robert Burgan, 2023. "Once More about Human Nature, Enhancement and Substitution [Ešte raz o ľudskej prirodzenosti, zdokonaľovaní a substitúcii]," E-LOGOS, Prague University of Economics and Business, vol. 2023(2), pages 4-55.
    2. repec:prg:jnlelg:v:preprint:id:499 is not listed on IDEAS
    3. Xiner Wang & Guo Bai & Jizhi Liang & Qianyang Xie & Zhaohan Chen & Erda Zhou & Meng Li & Xiaoling Wei & Liuyang Sun & Zhiyuan Zhang & Chi Yang & Tiger H. Tao & Zhitao Zhou, 2024. "Gustatory interface for operative assessment and taste decoding in patients with tongue cancer," Nature Communications, Nature, vol. 15(1), pages 1-10, December.
    4. Shanqing Cai & Subhashini Venugopalan & Katie Seaver & Xiang Xiao & Katrin Tomanek & Sri Jalasutram & Meredith Ringel Morris & Shaun Kane & Ajit Narayanan & Robert L. MacDonald & Emily Kornman & Danie, 2024. "Using large language models to accelerate communication for eye gaze typing users with ALS," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    5. Sarah K. Wandelt & David A. Bjånes & Kelsie Pejsa & Brian Lee & Charles Liu & Richard A. Andersen, 2024. "Representation of internal speech by single neurons in human supramarginal gyrus," Nature Human Behaviour, Nature, vol. 8(6), pages 1136-1149, June.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:nature:v:620:y:2023:i:7976:d:10.1038_s41586-023-06377-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.