IDEAS home Printed from https://ideas.repec.org/a/gam/jmathe/v13y2025i7p1072-d1620345.html

Frequency-Domain Hybrid Model for EEG-Based Emotion Recognition

Authors

Listed:
  • Jinyu Liu

    (College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China)

  • Naidan Feng

    (College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China)

  • Yongquan Liang

    (College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China)

Abstract

Emotion recognition based on Electroencephalogram (EEG) signals plays a vital role in affective computing and human–computer interaction (HCI). However, noise, artifacts, and signal distortions present challenges that limit classification accuracy and robustness. To address these issues, we propose ECA-ResDNN, a novel hybrid model designed to leverage the frequency, spatial, and temporal characteristics of EEG signals for improved emotion recognition. Unlike conventional models, ECA-ResDNN integrates an Efficient Channel Attention (ECA) mechanism within a residual neural network to enhance feature selection in the frequency domain while preserving essential spatial information. A Deep Neural Network further extracts temporal dependencies, improving classification precision. Additionally, a hybrid loss function that combines cross-entropy loss and fuzzy set loss enhances the model’s robustness to noise and uncertainty. Experimental results demonstrate that ECA-ResDNN significantly outperforms existing approaches in both accuracy and robustness, underscoring its potential for applications in affective computing, mental health monitoring, and intelligent human–computer interaction.
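The abstract's two main components, ECA-style channel gating and the hybrid loss, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: ECA normally learns the weights of its 1-D cross-channel convolution (a fixed averaging kernel stands in here), and since the exact fuzzy set loss is not specified on this page, a fuzzy-entropy penalty on the predicted memberships is assumed as a placeholder.

```python
import numpy as np

def eca_attention(x, k=3):
    """Sketch of Efficient Channel Attention on a (channels, H, W) map.
    Squeeze each channel by global average pooling, mix neighboring
    channels with a size-k 1-D convolution, then gate with a sigmoid."""
    c = x.shape[0]
    y = x.mean(axis=(1, 2))                       # squeeze: one value per channel
    yp = np.pad(y, k // 2, mode="edge")           # pad for same-length 1-D conv
    w = np.ones(k) / k                            # fixed kernel; learned in ECA proper
    z = np.array([np.dot(yp[i:i + k], w) for i in range(c)])
    gate = 1.0 / (1.0 + np.exp(-z))               # sigmoid channel weights in (0, 1)
    return x * gate[:, None, None]                # rescale each channel

def hybrid_loss(probs, labels, lam=0.1):
    """Illustrative cross-entropy + fuzzy-entropy combination.
    probs: (n, classes) softmax outputs; labels: (n,) integer classes.
    The fuzzy term grows when memberships are ambiguous (near 0.5)."""
    n = probs.shape[0]
    eps = 1e-12
    ce = -np.log(probs[np.arange(n), labels] + eps).mean()
    p = np.clip(probs, eps, 1 - eps)
    fuzzy = -(p * np.log(p) + (1 - p) * np.log(1 - p)).sum(axis=1).mean()
    return ce + lam * fuzzy

# Example: gate an 8-channel feature map and compare loss on confident
# vs. ambiguous predictions for the same true label.
feat = np.random.default_rng(0).normal(size=(8, 4, 4))
out = eca_attention(feat)
labels = np.array([0])
loss_confident = hybrid_loss(np.array([[0.95, 0.05]]), labels)
loss_ambiguous = hybrid_loss(np.array([[0.55, 0.45]]), labels)
```

Because the sigmoid gate lies strictly between 0 and 1, the attention step can only attenuate channels, never amplify them; the hypothetical fuzzy term adds a penalty that is larger for the ambiguous prediction than for the confident one.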

Suggested Citation

  • Jinyu Liu & Naidan Feng & Yongquan Liang, 2025. "Frequency-Domain Hybrid Model for EEG-Based Emotion Recognition," Mathematics, MDPI, vol. 13(7), pages 1-18, March.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:7:p:1072-:d:1620345

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/7/1072/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/7/1072/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:7:p:1072-:d:1620345. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.