Author
Listed:
- Jangwoon Park
(Department of Engineering, Texas A&M University-Corpus Christi, Corpus Christi, TX 78412, USA)
- Sinae Lee
(Department of English, Texas A&M University-Corpus Christi, Corpus Christi, TX 78412, USA)
- Kimberly Brotherton
(Department of Engineering, Texas A&M University-Corpus Christi, Corpus Christi, TX 78412, USA)
- Dugan Um
(Department of Engineering, Texas A&M University-Corpus Christi, Corpus Christi, TX 78412, USA)
- Jaehyun Park
(Department of Industrial & Management Engineering, Incheon National University, Incheon 22012, Korea)
Abstract
According to the similarity-attraction theory, humans respond more positively to people whose personalities are similar to their own. Recent studies on human-robot interaction show that this also holds between humans and robots, so a robot that can infer a user's personality and adjust its interactional patterns accordingly would be beneficial. The present study aims to identify speech characteristics, such as sound and lexical features, that differ significantly between two personality groups (introverts vs. extroverts), so that a robot can distinguish a user's personality from specific speech characteristics. Twenty-four male participants took the Myers-Briggs Type Indicator (MBTI) test for personality screening. The speech of these participants (identified through the MBTI test as 12 introversive and 12 extroversive males) was recorded while they verbally responded to the eight Walk-in-the-Wood questions. Speech, sound, and lexical features were then extracted. Averaged reaction time (1.200 s for introversive and 0.762 s for extroversive; p = 0.01) and total reaction time (9.39 s for introversive and 6.10 s for extroversive; p = 0.008) differed significantly between the two groups. However, averaged pitch frequency, sound power, and lexical features showed no significant differences. A binary logistic regression model developed to classify the two personality types achieved a classification accuracy of 70.8%. In summary, significant speech features distinguishing introversive and extroversive individuals have been identified, and a personality classification model has been developed. The identified features could be used to design or program a social robot that promotes human-robot interaction by matching its behavior to the user's estimated personality.
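To make the classification step concrete, the sketch below fits a binary logistic regression to the two features reported as significant in the abstract (averaged reaction time and total reaction time). This is not the authors' code: the paper's actual features, preprocessing, and validation scheme are not reproduced here, so the scikit-learn pipeline, the leave-one-out cross-validation, and the synthetic data (group means taken from the abstract, spreads invented for illustration) are all assumptions.

# Minimal sketch, not the authors' implementation: a binary logistic regression
# classifying introversive (0) vs. extroversive (1) speakers from two speech
# features, averaged reaction time and total reaction time (seconds).
# The data are synthetic: the group means come from the abstract, but the
# standard deviations and per-speaker values are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Columns: [averaged reaction time, total reaction time]
X_intro = np.column_stack([rng.normal(1.200, 0.35, 12), rng.normal(9.39, 2.5, 12)])
X_extro = np.column_stack([rng.normal(0.762, 0.25, 12), rng.normal(6.10, 2.0, 12)])
X = np.vstack([X_intro, X_extro])
y = np.array([0] * 12 + [1] * 12)  # 12 introversive, 12 extroversive speakers

# Leave-one-out cross-validation is an assumption chosen because n = 24;
# the abstract does not state how the reported 70.8% accuracy was estimated.
model = LogisticRegression()
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(f"Cross-validated accuracy on the synthetic data: {scores.mean():.3f}")

Whether the paper used cross-validation, a hold-out split, or resubstitution accuracy is not stated in the abstract, so the printed figure should not be compared directly with the reported 70.8%.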
Suggested Citation
Jangwoon Park & Sinae Lee & Kimberly Brotherton & Dugan Um & Jaehyun Park, 2020.
"Identification of Speech Characteristics to Distinguish Human Personality of Introversive and Extroversive Male Groups,"
IJERPH, MDPI, vol. 17(6), pages 1-12, March.
Handle:
RePEc:gam:jijerp:v:17:y:2020:i:6:p:2125-:d:335834