Author
Listed:
- Jacqueline Kory Westlund
- Sidney K D’Mello
- Andrew M Olney
Abstract
Researchers in the cognitive and affective sciences investigate how thoughts and feelings are reflected in the bodily response systems, including peripheral physiology, facial features, and body movements. One specific question along this line of research is how cognition and affect are manifested in the dynamics of general body movements. Progress in this area can be accelerated by inexpensive, non-intrusive, portable, scalable, and easy-to-calibrate movement tracking systems. Towards this end, this paper presents and validates Motion Tracker, a simple yet effective software program that uses established computer vision techniques to estimate the amount a person moves from a video of the person engaged in a task (available for download from http://jakory.com/motion-tracker/). The system works with any commercially available camera and with existing videos, thereby affording inexpensive, non-intrusive, and potentially portable and scalable estimation of body movement. Strong between-subject correlations were obtained between Motion Tracker's estimates of movement and body movements recorded from the seat (r = .720) and back (r = .695 for participants with higher back movement) of a chair affixed with pressure sensors while participants completed a 32-minute computerized task (Study 1). Within-subject cross-correlations were also strong for both the seat (r = .606) and back (r = .507). In Study 2, between-subject correlations between Motion Tracker's movement estimates and movements recorded from an accelerometer worn on the wrist were also strong (rs = .801, .679, and .681) while people performed three brief actions (e.g., waving). Finally, in Study 3 the within-subject cross-correlation was high (r = .855) when Motion Tracker's estimates were correlated with the movement of a person's head as tracked with a Kinect while the person was seated at a desk. Best-practice recommendations, limitations, and planned extensions of the system are discussed.
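The abstract does not spell out the silhouette algorithm, but frame differencing is one established computer vision technique for producing motion silhouettes of the kind the title describes: pixels that change appreciably between consecutive frames are marked as "moving," and the size of that region serves as a per-frame movement score. The sketch below is illustrative only (the threshold value and function name are assumptions, not the paper's implementation):

```python
import numpy as np

def motion_estimate(prev_frame, curr_frame, threshold=15):
    """Estimate movement between two grayscale frames via frame differencing.

    Pixels whose intensity changed by more than `threshold` form a binary
    motion silhouette; the fraction of such pixels is a simple movement score.
    """
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    silhouette = diff > threshold      # binary motion silhouette
    return silhouette.mean()           # fraction of pixels that moved

# Example: a bright 20x20 square shifts two pixels to the right.
prev = np.zeros((64, 64), dtype=np.uint8)
prev[20:40, 20:40] = 200
curr = np.zeros((64, 64), dtype=np.uint8)
curr[20:40, 22:42] = 200

score = motion_estimate(prev, curr)   # positive: the square moved
still = motion_estimate(prev, prev)   # 0.0: identical frames
```

Summing or averaging this score over all frame pairs in a video yields the kind of aggregate body-movement estimate the validation studies correlate against the pressure-sensor, accelerometer, and Kinect measurements.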
Suggested Citation
Jacqueline Kory Westlund & Sidney K D’Mello & Andrew M Olney, 2015.
"Motion Tracker: Camera-Based Monitoring of Bodily Movements Using Motion Silhouettes,"
PLOS ONE, Public Library of Science, vol. 10(6), pages 1-27, June.
Handle:
RePEc:plo:pone00:0130293
DOI: 10.1371/journal.pone.0130293
Citations
Citations are extracted by the CitEc Project.
Cited by:
- Ting Tao & Ryota Sato & Yusuke Matsuda & Jumpei Takata & Fijun Kim & Yukio Daikubara & Koji Fujita & Kotaro Hanamoto & Fumio Kinoshita & Ricki Colman & Mamiko Koshiba, 2020.
"Elderly Body Movement Alteration at 2nd Experience of Digital Art Installation with Cognitive and Motivation Scores,"
J, MDPI, vol. 3(2), pages 1-13, April.
- Bahaa Mustafa, 2022.
"Using 3D Animation and Virtual Reality in Educations,"
Technium Social Sciences Journal, Technium Science, vol. 27(1), pages 269-289, January.