Printed from https://ideas.repec.org/a/nat/nathum/v8y2024i6d10.1038_s41562-024-01867-y.html

Representation of internal speech by single neurons in human supramarginal gyrus

Authors

  • Sarah K. Wandelt (California Institute of Technology)

  • David A. Bjånes (California Institute of Technology; Rancho Los Amigos National Rehabilitation Center)

  • Kelsie Pejsa (California Institute of Technology)

  • Brian Lee (California Institute of Technology; Keck School of Medicine of USC)

  • Charles Liu (California Institute of Technology; Rancho Los Amigos National Rehabilitation Center; Keck School of Medicine of USC)

  • Richard A. Andersen (California Institute of Technology)

Abstract

Speech brain–machine interfaces (BMIs) translate brain signals into words or audio outputs, enabling communication for people who have lost their speech abilities due to disease or injury. While important advances in decoding vocalized, attempted and mimed speech have been achieved, results for internal speech decoding remain sparse and have yet to reach high functionality. Notably, it is still unclear from which brain areas internal speech can be decoded. Here, two participants with tetraplegia, implanted with microelectrode arrays in the supramarginal gyrus (SMG) and primary somatosensory cortex (S1), performed internal and vocalized speech of six words and two pseudowords. In both participants, we found significant neural representation of internal and vocalized speech at the single-neuron and population level in the SMG. From recorded SMG population activity, the internally spoken and vocalized words were significantly decodable. In an offline analysis, we achieved average decoding accuracies of 55% and 24% for the two participants, respectively (chance level 12.5%), and during an online internal speech BMI task we averaged 79% and 23% accuracy, respectively. Evidence of shared neural representations between internal speech, word reading and vocalized speech was found in participant 1. The SMG represented words as well as pseudowords, providing evidence for phonetic encoding. Furthermore, our decoder achieved high classification accuracy with multiple internal speech strategies (auditory imagination/visual imagination). Activity in S1 was modulated by vocalized but not internal speech in both participants, suggesting that no articulator movements of the vocal tract occurred during internal speech production. This work represents a proof of concept for a high-performance internal speech BMI.
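The decoding setup described in the abstract can be illustrated with a minimal sketch: with six words and two pseudowords there are 8 classes, so chance-level accuracy is 1/8 = 12.5%, and a decoder predicts a word from a per-trial vector of neural features (e.g., firing rates across array channels). The sketch below uses simulated data and a simple leave-one-out nearest-centroid classifier; the data, dimensions and classifier are illustrative assumptions, not the paper's actual recordings or decoding method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup mirroring the task: 8 classes (6 words + 2 pseudowords),
# with feature vectors standing in for per-trial firing rates across channels.
n_classes, trials_per_class, n_channels = 8, 16, 64
chance_level = 1.0 / n_classes  # 12.5%, the chance level quoted in the abstract

# Simulate class-dependent "firing rates" (purely synthetic, not real data).
centroids = rng.normal(0.0, 1.0, size=(n_classes, n_channels))
X = np.repeat(centroids, trials_per_class, axis=0) \
    + rng.normal(0.0, 1.5, size=(n_classes * trials_per_class, n_channels))
y = np.repeat(np.arange(n_classes), trials_per_class)

# Leave-one-out nearest-centroid decoding: hold out one trial, recompute the
# class means from the remaining trials, predict the nearest class mean.
correct = 0
for i in range(len(y)):
    mask = np.ones(len(y), dtype=bool)
    mask[i] = False
    means = np.stack([X[mask & (y == c)].mean(axis=0) for c in range(n_classes)])
    pred = int(np.argmin(np.linalg.norm(means - X[i], axis=1)))
    correct += pred == y[i]

accuracy = correct / len(y)
print(f"chance = {chance_level:.1%}, decoded = {accuracy:.1%}")
```

On clearly separable simulated data this decodes well above the 12.5% chance level; the point of the sketch is the bookkeeping (8-way classification, cross-validated accuracy versus chance), not the specific classifier.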

Suggested Citation

  • Sarah K. Wandelt & David A. Bjånes & Kelsie Pejsa & Brian Lee & Charles Liu & Richard A. Andersen, 2024. "Representation of internal speech by single neurons in human supramarginal gyrus," Nature Human Behaviour, Nature, vol. 8(6), pages 1136-1149, June.
  • Handle: RePEc:nat:nathum:v:8:y:2024:i:6:d:10.1038_s41562-024-01867-y
    DOI: 10.1038/s41562-024-01867-y

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41562-024-01867-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/s41562-024-01867-y?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Junfeng Lu & Yuanning Li & Zehao Zhao & Yan Liu & Yanming Zhu & Ying Mao & Jinsong Wu & Edward F. Chang, 2023. "Neural control of lexical tone production in human laryngeal motor cortex," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    2. Sean L. Metzger & Jessie R. Liu & David A. Moses & Maximilian E. Dougherty & Margaret P. Seaton & Kaylo T. Littlejohn & Josh Chartier & Gopala K. Anumanchipalli & Adelyn Tu-Chan & Karunesh Ganguly & E, 2022. "Generalizable spelling using a speech neuroprosthesis in an individual with severe limb and vocal paralysis," Nature Communications, Nature, vol. 13(1), pages 1-15, December.
    3. Taemin Kim & Yejee Shin & Kyowon Kang & Kiho Kim & Gwanho Kim & Yunsu Byeon & Hwayeon Kim & Yuyan Gao & Jeong Ryong Lee & Geonhui Son & Taeseong Kim & Yohan Jun & Jihyun Kim & Jinyoung Lee & Seyun Um , 2022. "Ultrathin crystalline-silicon-based strain gauges with deep learning algorithms for silent speech interfaces," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    4. You Wang & Ming Zhang & Ruifen Hu & Guang Li & Nan Li, 2020. "Silent Speech Recognition for BCI - A Review," Biomedical Journal of Scientific & Technical Research, Biomedical Research Network+, LLC, vol. 27(2), pages 20625-20627, April.
    5. Suseendrakumar Duraivel & Shervin Rahimpour & Chia-Han Chiang & Michael Trumpis & Charles Wang & Katrina Barth & Stephen C. Harward & Shivanand P. Lad & Allan H. Friedman & Derek G. Southwell & Saurab, 2023. "High-resolution neural recordings improve the accuracy of speech decoding," Nature Communications, Nature, vol. 14(1), pages 1-16, December.
    6. Joshua Kosnoff & Kai Yu & Chang Liu & Bin He, 2024. "Transcranial focused ultrasound to V5 enhances human visual motion brain-computer interface by modulating feature-based attention," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    7. Xiao-yu Sun & Bin Ye, 2023. "The functional differentiation of brain–computer interfaces (BCIs) and its ethical implications," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-9, December.
    8. Robert Burgan, 2023. "Once More about Human Nature, Enhancement and Substitution [Ešte raz o ľudskej prirodzenosti, zdokonaľovaní a substitúcii]," E-LOGOS, Prague University of Economics and Business, vol. 2023(2), pages 4-55.
    9. Huixin Tan & Xiaoyu Zeng & Jun Ni & Kun Liang & Cuiping Xu & Yanyang Zhang & Jiaxin Wang & Zizhou Li & Jiaxin Yang & Chunlei Han & Yuan Gao & Xinguang Yu & Shihui Han & Fangang Meng & Yina Ma, 2024. "Intracranial EEG signals disentangle multi-areal neural dynamics of vicarious pain perception," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    10. Prateek Jain & Alberto Garcia Garcia, 2022. "Quantum classical hybrid neural networks for continuous variable prediction," Papers 2212.04209, arXiv.org, revised Mar 2023.
    11. Elisa Donati & Giacomo Valle, 2024. "Neuromorphic hardware for somatosensory neuroprostheses," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
