IDEAS home Printed from https://ideas.repec.org/a/nat/natcom/v12y2021i1d10.1038_s41467-021-25637-w.html

AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove

Authors

Listed:
  • Feng Wen

    (National University of Singapore
    National University of Singapore Suzhou Research Institute (NUSRI))

  • Zixuan Zhang

    (National University of Singapore
    National University of Singapore Suzhou Research Institute (NUSRI))

  • Tianyiyi He

    (National University of Singapore
    National University of Singapore Suzhou Research Institute (NUSRI))

  • Chengkuo Lee

    (National University of Singapore
    National University of Singapore Suzhou Research Institute (NUSRI))

Abstract

Sign language recognition, especially sentence recognition, is of great significance for lowering the communication barrier between the hearing/speech impaired and non-signers. General glove solutions, which are employed to detect the motions of our dexterous hands, only recognize discrete single gestures (i.e., numbers, letters, or words) rather than sentences, falling far short of signers' daily communication needs. Here, we propose an artificial-intelligence-enabled sign language recognition and communication system comprising sensing gloves, a deep learning block, and a virtual reality interface. A non-segmentation and segmentation-assisted deep learning model achieves the recognition of 50 words and 20 sentences. Significantly, the segmentation approach splits entire sentence signals into word units; the deep learning model then recognizes all word elements and reconstructs and recognizes the sentences from them. Furthermore, new/never-seen sentences created by recombining word elements in new orders can be recognized with an average correct rate of 86.67%. Finally, the sign language recognition results are projected into virtual space and translated into text and audio, allowing remote and bidirectional communication between signers and non-signers.
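The segmentation-assisted pipeline described in the abstract — splitting a continuous sentence signal into word units, classifying each unit, and reconstructing the sentence — can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the amplitude threshold, the gap length, and the `classify_word` stub are all assumed placeholders for the paper's triboelectric signal processing and deep learning model.

```python
import numpy as np

def segment_signal(signal, threshold=0.2, min_gap=5):
    """Split a 1-D glove signal into word-unit segments wherever the
    amplitude stays below `threshold` for at least `min_gap` samples.
    Returns a list of (start, end) sample indices, one per word unit."""
    active = np.abs(np.asarray(signal)) > threshold
    segments, start, gap = [], None, 0
    for i, is_active in enumerate(active):
        if is_active:
            if start is None:
                start = i          # a new word unit begins
            gap = 0
        elif start is not None:
            gap += 1
            if gap >= min_gap:     # long enough pause: close the unit
                segments.append((start, i - gap + 1))
                start, gap = None, 0
    if start is not None:          # signal ended mid-word
        segments.append((start, len(active)))
    return segments

def recognize_sentence(signal, classify_word):
    """Classify each word-unit segment with `classify_word` (standing in
    for the paper's deep learning model) and join the predicted words,
    reconstructing the sentence from its recognized word elements."""
    signal = np.asarray(signal)
    words = [classify_word(signal[s:e]) for s, e in segment_signal(signal)]
    return " ".join(words)
```

Because recognition happens at the word level, sentences assembled from new orderings of known word elements can still be recognized, which is the mechanism behind the reported 86.67% accuracy on never-seen sentences.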

Suggested Citation

  • Feng Wen & Zixuan Zhang & Tianyiyi He & Chengkuo Lee, 2021. "AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove," Nature Communications, Nature, vol. 12(1), pages 1-13, December.
  • Handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-25637-w
    DOI: 10.1038/s41467-021-25637-w

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-021-25637-w
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-021-25637-w?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Min Chen & Jingyu Ouyang & Aijia Jian & Jia Liu & Pan Li & Yixue Hao & Yuchen Gong & Jiayu Hu & Jing Zhou & Rui Wang & Jiaxi Wang & Long Hu & Yuwei Wang & Ju Ouyang & Jing Zhang & Chong Hou & Lei Wei , 2022. "Imperceptible, designable, and scalable braided electronic cord," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    2. Taemin Kim & Yejee Shin & Kyowon Kang & Kiho Kim & Gwanho Kim & Yunsu Byeon & Hwayeon Kim & Yuyan Gao & Jeong Ryong Lee & Geonhui Son & Taeseong Kim & Yohan Jun & Jihyun Kim & Jinyoung Lee & Seyun Um , 2022. "Ultrathin crystalline-silicon-based strain gauges with deep learning algorithms for silent speech interfaces," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    3. Yijia Lu & Han Tian & Jia Cheng & Fei Zhu & Bin Liu & Shanshan Wei & Linhong Ji & Zhong Lin Wang, 2022. "Decoding lip language using triboelectric sensors with deep learning," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    4. Zhongda Sun & Minglu Zhu & Xuechuan Shan & Chengkuo Lee, 2022. "Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions," Nature Communications, Nature, vol. 13(1), pages 1-13, December.
    5. Zhao, Lin-Chuan & Zhou, Teng & Chang, Si-Deng & Zou, Hong-Xiang & Gao, Qiu-Hua & Wu, Zhi-Yuan & Yan, Ge & Wei, Ke-Xiang & Yeatman, Eric M. & Meng, Guang & Zhang, Wen-Ming, 2024. "A disposable cup inspired smart floor for trajectory recognition and human-interactive sensing," Applied Energy, Elsevier, vol. 357(C).

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-25637-w. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.