
Quantum classical hybrid neural networks for continuous variable prediction

Author

Listed:
  • Prateek Jain
  • Alberto Garcia Garcia

Abstract

Within this decade, quantum computers are predicted to outperform conventional computers in processing power and to have a disruptive effect on a variety of business sectors. The financial sector is expected to be one of the first to benefit from quantum computing, in both the short and the long term. In this work we use hybrid quantum-classical neural networks to present a quantum machine learning approach to continuous-variable prediction.
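The abstract names hybrid quantum-classical neural networks applied to continuous-variable (regression) prediction. As an illustration of that general technique only, and not of the authors' architecture, the sketch below wraps a small variational circuit inside a PyTorch regressor using PennyLane's TorchLayer; the qubit count, ansatz, data encoding, and hyperparameters are assumptions made for the example.

```python
# Minimal hypothetical sketch of a hybrid quantum-classical regressor
# (PennyLane + PyTorch). Illustrative only; not the paper's implementation.
import torch
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    # Encode classical features as rotation angles, apply a trainable
    # entangling ansatz, and read out Pauli-Z expectation values.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (3, n_qubits)}  # 3 variational layers (assumed)

model = torch.nn.Sequential(
    torch.nn.Linear(8, n_qubits),                 # classical pre-processing
    torch.nn.Tanh(),                              # squash into a rotation-friendly range
    qml.qnn.TorchLayer(circuit, weight_shapes),   # quantum layer
    torch.nn.Linear(n_qubits, 1),                 # classical head for the continuous target
)

# Standard regression training loop on synthetic data.
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()
X, y = torch.randn(64, 8), torch.randn(64, 1)
for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```

In this kind of hybrid model the classical layers handle feature compression and the final continuous output, while the variational circuit serves as a trainable nonlinear feature map; this division of labour is the typical pattern, though the specific split used in the paper may differ.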

Suggested Citation

  • Prateek Jain & Alberto Garcia Garcia, 2022. "Quantum classical hybrid neural networks for continuous variable prediction," Papers 2212.04209, arXiv.org, revised Mar 2023.
  • Handle: RePEc:arx:papers:2212.04209

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2212.04209
    File Function: Latest version
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. You Wang & Ming Zhang & Ruifen Hu & Guang Li & Nan Li, 2020. "Silent Speech Recognition for BCI - A Review," Biomedical Journal of Scientific & Technical Research, Biomedical Research Network+, LLC, vol. 27(2), pages 20625-20627, April.
    2. Junfeng Lu & Yuanning Li & Zehao Zhao & Yan Liu & Yanming Zhu & Ying Mao & Jinsong Wu & Edward F. Chang, 2023. "Neural control of lexical tone production in human laryngeal motor cortex," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    3. Suseendrakumar Duraivel & Shervin Rahimpour & Chia-Han Chiang & Michael Trumpis & Charles Wang & Katrina Barth & Stephen C. Harward & Shivanand P. Lad & Allan H. Friedman & Derek G. Southwell & Saurab, 2023. "High-resolution neural recordings improve the accuracy of speech decoding," Nature Communications, Nature, vol. 14(1), pages 1-16, December.
    4. Joshua Kosnoff & Kai Yu & Chang Liu & Bin He, 2024. "Transcranial focused ultrasound to V5 enhances human visual motion brain-computer interface by modulating feature-based attention," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    5. Xiao-yu Sun & Bin Ye, 2023. "The functional differentiation of brain–computer interfaces (BCIs) and its ethical implications," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-9, December.
    6. Sean L. Metzger & Jessie R. Liu & David A. Moses & Maximilian E. Dougherty & Margaret P. Seaton & Kaylo T. Littlejohn & Josh Chartier & Gopala K. Anumanchipalli & Adelyn Tu-Chan & Karunesh Ganguly & E, 2022. "Generalizable spelling using a speech neuroprosthesis in an individual with severe limb and vocal paralysis," Nature Communications, Nature, vol. 13(1), pages 1-15, December.
    7. Taemin Kim & Yejee Shin & Kyowon Kang & Kiho Kim & Gwanho Kim & Yunsu Byeon & Hwayeon Kim & Yuyan Gao & Jeong Ryong Lee & Geonhui Son & Taeseong Kim & Yohan Jun & Jihyun Kim & Jinyoung Lee & Seyun Um , 2022. "Ultrathin crystalline-silicon-based strain gauges with deep learning algorithms for silent speech interfaces," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    8. Sarah K. Wandelt & David A. Bjånes & Kelsie Pejsa & Brian Lee & Charles Liu & Richard A. Andersen, 2024. "Representation of internal speech by single neurons in human supramarginal gyrus," Nature Human Behaviour, Nature, vol. 8(6), pages 1136-1149, June.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2212.04209. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.