Authors
Listed:
- Nadeesha M. Gunaratne
- Claudia Gonzalez Viejo
- Thejani M. Gunaratne
- Damir D. Torrico
- Hollis Ashman
- Frank R. Dunshea
- Sigfredo Fuentes

(All authors: School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia)
Abstract
The study of emotions has gained interest in the field of sensory and consumer research. Accurate information can be obtained by studying physiological responses along with self-reported responses. The aim of this study was to identify physiological and self-reported responses towards visual stimuli and to predict self-reported responses using biometrics. Panelists (N = 63) were exposed to 12 images (ten from the Geneva Affective PicturE Database (GAPED) and two based on common fears) and completed a questionnaire (Face scale and EsSense). Emotions from facial expressions (FaceReader™), heart rate (HR), systolic pressure (SP), diastolic pressure (DP), and skin temperature (ST) were analyzed. Multiple regression analysis was used to predict self-reported responses based on biometrics. Results showed that physiological responses, together with self-reported responses, were able to separate the images by cluster analysis into positive, neutral, or negative categories according to the GAPED classification. Emotional terms with high or low valence were predicted by a general linear regression model using biometrics, while calm, which lies at the center of the dimensional model of emotion, was not predicted. After separating the images, the positive and neutral categories could predict all emotional terms, while the negative category predicted only Happy, Sad, and Scared. HR predicted emotions in the positive (R² = 0.52 for Scared) and neutral (R² = 0.55 for Sad) categories, while ST did so for positive images (R² = 0.55 for Sad, R² = 0.45 for Calm).
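The multiple regression approach described in the abstract (predicting a self-reported emotion score from the biometric measures HR, SP, DP, and ST, and reporting a coefficient of determination R²) can be sketched as follows. This is a minimal illustration with synthetic data, not the study's dataset or exact model; the coefficients and noise level are arbitrary assumptions.

```python
# Sketch of a multiple linear regression predicting a self-reported
# emotion score from biometric predictors, in the spirit of the
# general linear model mentioned in the abstract.
# All data below are synthetic illustrations, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
n = 63  # panel size reported in the abstract

# Hypothetical biometric predictors: HR, SP, DP, ST
X = np.column_stack([
    rng.normal(70, 8, n),    # heart rate (bpm)
    rng.normal(120, 10, n),  # systolic pressure (mmHg)
    rng.normal(80, 8, n),    # diastolic pressure (mmHg)
    rng.normal(33, 1, n),    # skin temperature (deg C)
])
# Synthetic self-reported score, linear in the predictors plus noise
y = X @ np.array([0.02, 0.01, -0.01, 0.3]) + rng.normal(0, 0.5, n)

# Ordinary least squares fit with an intercept term
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination (R^2) of the fitted model
y_hat = A @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

In practice one such model would be fitted per emotional term (e.g. Happy, Sad, Scared, Calm) and per image category, and R² compared across them, as the abstract reports.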
Suggested Citation
Nadeesha M. Gunaratne & Claudia Gonzalez Viejo & Thejani M. Gunaratne & Damir D. Torrico & Hollis Ashman & Frank R. Dunshea & Sigfredo Fuentes, 2019.
"Effects of Imagery as Visual Stimuli on the Physiological and Emotional Responses,"
J, MDPI, vol. 2(2), pages 1-20, June.
Handle:
RePEc:gam:jjopen:v:2:y:2019:i:2:p:15-225:d:239291