Author
Murugappan M. & Mutawa A.
Abstract
Emotion plays a significant role in interpersonal communication and in improving social life. In recent years, facial emotion recognition has been widely adopted in developing human-computer interfaces (HCI) and humanoid robots. In this work, a triangulation method for extracting a novel set of geometric features is proposed to classify six emotional expressions (sadness, anger, fear, surprise, disgust, and happiness) using computer-generated markers. The subject's face is detected using Haar-like features. A mathematical model places eight virtual markers at defined locations on the subject's face in an automated way. Five triangles are formed by using the eight markers' positions as the edges of each triangle. These eight markers are then continuously tracked by the Lucas-Kanade optical flow algorithm while subjects articulate facial expressions. The movement of the markers during a facial expression directly changes the properties of each triangle. The area of the triangle (AoT), the inscribed circle circumference (ICC), and the inscribed circle area of the triangle (ICAT) are extracted as features to classify the facial emotions. These features are used to distinguish the six facial emotions using several machine learning algorithms. The ICAT feature gives the highest mean classification rate of 98.17% with a Random Forest (RF) classifier, compared to the other features and classifiers, in distinguishing emotional expressions.
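The abstract describes a complete pipeline: Haar-cascade face detection, placement of eight virtual markers, Lucas-Kanade tracking, and three triangle-based features. The sketch below is one possible reading of that pipeline in Python with OpenCV, not the authors' implementation: the marker offsets, the grouping of the eight markers into five triangles, and the tracker parameters are illustrative assumptions; only the Haar-cascade detector, the Lucas-Kanade call, and the Heron's-formula/inradius geometry are standard.

```python
# Illustrative sketch (not the paper's released code): Haar-cascade face
# detection, Lucas-Kanade tracking of eight virtual markers, and the three
# triangle features named in the abstract (AoT, ICC, ICAT).
import math
import cv2
import numpy as np

def place_markers(face_box):
    """Place eight virtual markers at fixed positions inside the detected
    face box. The fractional offsets here are hypothetical, not the paper's."""
    x, y, w, h = face_box
    offsets = [(0.30, 0.35), (0.70, 0.35),   # around the eyes
               (0.30, 0.55), (0.70, 0.55),   # cheeks
               (0.50, 0.45),                 # nose tip
               (0.35, 0.75), (0.65, 0.75),   # mouth corners
               (0.50, 0.85)]                 # chin
    pts = [(x + fx * w, y + fy * h) for fx, fy in offsets]
    return np.array(pts, dtype=np.float32).reshape(-1, 1, 2)

def triangle_features(p1, p2, p3):
    """Area of triangle (AoT), inscribed-circle circumference (ICC), and
    inscribed-circle area (ICAT) from three marker positions."""
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    s = (a + b + c) / 2.0                                        # semi-perimeter
    aot = math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))   # Heron's formula
    r = aot / s if s > 0 else 0.0                                # inradius
    return aot, 2.0 * math.pi * r, math.pi * r * r

# Hypothetical grouping of the eight markers (indices 0-7) into five triangles.
TRIANGLES = [(0, 1, 4), (0, 2, 5), (1, 3, 6), (5, 6, 7), (2, 3, 4)]

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)            # webcam; replace with a video file path
prev_gray, markers = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if markers is None:
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) == 0:
            continue
        markers = place_markers(faces[0])
    else:
        # Track the eight markers frame-to-frame with Lucas-Kanade optical flow.
        markers, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, markers, None, winSize=(21, 21), maxLevel=3)

    pts = markers.reshape(-1, 2)
    feats = [triangle_features(pts[i], pts[j], pts[k]) for i, j, k in TRIANGLES]
    # `feats` holds (AoT, ICC, ICAT) per triangle for the current frame.
    prev_gray = gray
```

The per-frame feature vectors (five triangles, three features each) would then be fed to classifiers such as scikit-learn's RandomForestClassifier, mirroring the comparison of features and classifiers reported in the abstract.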
Suggested Citation
Murugappan M. & Mutawa A., 2021.
"Facial geometric feature extraction based emotional expression classification using machine learning algorithms,"
PLOS ONE, Public Library of Science, vol. 16(2), pages 1-20, February.
Handle: RePEc:plo:pone00:0247131
DOI: 10.1371/journal.pone.0247131