Printed from https://ideas.repec.org/h/spr/lnopch/978-3-031-15644-1_11.html

Group-Level Human Affect Recognition with Multiple Graph Kernel Fusion

In: City, Society, and Digital Transformation

Author

Listed:
  • Xiaohua Huang

    (Nanjing Institute of Technology; Jiangsu Province Engineering Research Center)

Abstract

Research on group-level affect recognition has been emerging as a means of predicting human behavior in a group. Due to the variability in group size, a critical issue must be addressed: how to efficiently and effectively describe the affect similarity of two group-level images. To tackle this problem, this paper makes two contributions: (1) a similarity measurement of group affect based on a graph kernel; (2) the incorporation of multiple kernels and a deep learning architecture for recognizing group-level affect. In this paper, we view a group as a graph. We first formulate the graph of the group and then build the kernel between any two graphs, which makes it possible to use any kernel classifier efficiently. To further exploit the advantages of multiple kernels, we use two feature descriptors to extract face features. Subsequently, we propose a straightforward three-layer deep multiple kernel learning architecture. To resolve the non-differentiability problem, we present the graph kernel as the input to four kinds of base kernels and learn their corresponding weights. Intensive experiments are performed on a challenging group-level affective database. Performance comparisons clearly demonstrate the advantages of the graph kernel and deep multiple kernel learning. Additionally, our proposed approach obtains promising performance for group-level affect recognition compared with recent state-of-the-art methods.
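The chapter itself is not available here, so the following is only a minimal illustrative sketch of the general idea described in the abstract: each group image is treated as a graph whose nodes are faces with feature vectors, a kernel is computed between two such graphs, and several base kernels (Gaussian, linear, polynomial, Laplacian are assumed example choices) are fused with mixing weights. The node-averaging graph kernel, the specific base kernels, and the fixed weights below are assumptions for illustration, not the chapter's exact method, in which the weights are learned within a deep multiple kernel learning framework.

    import numpy as np

    def base_kernels(x, y):
        """Four example base kernels between two face-feature vectors (assumed choices)."""
        d2 = np.sum((x - y) ** 2)
        return np.array([
            np.exp(-d2 / 2.0),               # Gaussian (RBF)
            float(x @ y),                    # linear
            (float(x @ y) + 1.0) ** 2,       # polynomial, degree 2
            np.exp(-np.sum(np.abs(x - y))),  # Laplacian
        ])

    def group_graph_kernel(group_a, group_b, weights):
        """Kernel between two groups of faces: mean pairwise fused node kernel.

        group_a, group_b: arrays of shape (n_faces, feature_dim); the two groups may
        contain different numbers of faces, which is the variable-group-size issue
        the chapter addresses.
        weights: non-negative mixing weights over the base kernels.
        """
        vals = [weights @ base_kernels(a, b) for a in group_a for b in group_b]
        return float(np.mean(vals))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        g1 = rng.normal(size=(3, 8))        # group image with 3 faces, 8-dim features
        g2 = rng.normal(size=(5, 8))        # group image with 5 faces
        w = np.array([0.4, 0.2, 0.2, 0.2])  # placeholder weights; learned in practice
        print(group_graph_kernel(g1, g2, w))

The resulting kernel value can be fed to any kernel classifier (e.g., an SVM), which is the efficiency advantage the abstract attributes to the graph-kernel formulation.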

Suggested Citation

  • Xiaohua Huang, 2022. "Group-Level Human Affect Recognition with Multiple Graph Kernel Fusion," Lecture Notes in Operations Research, in: Robin Qiu & Wai Kin Victor Chan & Weiwei Chen & Youakim Badr & Canrong Zhang (ed.), City, Society, and Digital Transformation, chapter 0, pages 127-140, Springer.
  • Handle: RePEc:spr:lnopch:978-3-031-15644-1_11
    DOI: 10.1007/978-3-031-15644-1_11

    Download full text from publisher

    To our knowledge, this item is not available for download. To find out whether it is available, there are three options:
    1. Check below whether another version of this item is available online.
    2. Check on the provider's web page whether it is in fact available.
    3. Perform a search for a similarly titled item that would be available.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:lnopch:978-3-031-15644-1_11. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.