Printed from https://ideas.repec.org/a/hin/complx/6862031.html

Decoding Attentional State to Faces and Scenes Using EEG Brainwaves

Author

Listed:
  • Reza Abiri
  • Soheil Borhani
  • Yang Jiang
  • Xiaopeng Zhao

Abstract

Attention is the ability to facilitate the processing of perceptually salient information while blocking out information irrelevant to an ongoing task. Visual attention, for example, is a complex phenomenon of searching for a target while filtering out competing stimuli. In the present study, we developed a new Brain-Computer Interface (BCI) platform to decode brainwave patterns during sustained attention. Scalp electroencephalography (EEG) signals were collected in real time with a wireless headset during a visual attention task. In our experimental protocol, we primed participants to discriminate a sequence of composite images, each a fair superimposition of a scene image and a face image. Participants were asked to respond to the intended subcategory (e.g., indoor scenes) while withholding their responses to irrelevant subcategories (e.g., outdoor scenes). Using machine learning techniques, we developed an individualized model to decode each participant's attentional state from their brainwaves. The model revealed instantaneous attention towards the face and scene categories. We conducted the experiment with six volunteer participants. The average decoding accuracy of our model was about 77%, comparable with an earlier study using functional magnetic resonance imaging (fMRI). The present work is an attempt to reveal the momentary level of sustained attention using EEG signals. The platform may have potential applications in visual attention evaluation and closed-loop brainwave regulation in the future.
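The abstract describes an individualized decoding pipeline: per-participant EEG epochs are reduced to features and a classifier separates face-attention from scene-attention trials. The paper's actual features and classifier are not specified here, so the following is a minimal sketch under stated assumptions — synthetic band-power features stand in for real EEG, and a simple nearest-centroid rule stands in for the authors' machine learning model.

```python
# Hypothetical sketch of per-participant attention decoding (face vs. scene).
# Assumptions: synthetic band-power features replace real EEG recordings, and
# a nearest-centroid classifier replaces the paper's unspecified ML model.
import numpy as np

rng = np.random.default_rng(0)

def make_epochs(n_epochs, n_features, shift):
    # Simulated feature vectors (e.g., 8 channels x 5 frequency bands)
    # for one attentional state, offset by `shift` to mimic class structure.
    return rng.normal(loc=shift, scale=1.0, size=(n_epochs, n_features))

# Two classes: attending faces (label 0) vs. attending scenes (label 1).
X = np.vstack([make_epochs(60, 40, 0.0), make_epochs(60, 40, 0.8)])
y = np.array([0] * 60 + [1] * 60)

# Individualized model: train and test within one participant's data.
idx = rng.permutation(len(y))
train, test = idx[:90], idx[90:]

# Nearest-centroid decoding: label each test epoch by the closer class mean.
centroids = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

On real data, the analogous steps would be epoching the EEG around each composite-image presentation, extracting features, and fitting the classifier separately for each of the six participants before averaging accuracies.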

Suggested Citation

  • Reza Abiri & Soheil Borhani & Yang Jiang & Xiaopeng Zhao, 2019. "Decoding Attentional State to Faces and Scenes Using EEG Brainwaves," Complexity, Hindawi, vol. 2019, pages 1-10, February.
  • Handle: RePEc:hin:complx:6862031
    DOI: 10.1155/2019/6862031

    Download full text from publisher

    File URL: http://downloads.hindawi.com/journals/8503/2019/6862031.pdf
    Download Restriction: no

    File URL: http://downloads.hindawi.com/journals/8503/2019/6862031.xml
    Download Restriction: no

File URL: https://libkey.io/10.1155/2019/6862031?utm_source=ideas


