Author
Listed:
- Filipe Gaspar
(ADETTI-IUL / ISCTE-Lisbon University Institute, Portugal)
- Rafael Bastos
(Vision-Box & ADETTI-IUL / ISCTE-Lisbon University Institute, Portugal)
- Miguel Sales Dias
(Microsoft Language Development Center & ISCTE-Lisbon University Institute, Portugal)
Abstract
In large-scale immersive virtual reality (VR) environments, such as a CAVE, one of the most common problems is tracking the position of the user's head while he or she is immersed in the environment, so that perspective changes are reflected in the synthetic stereoscopic images. In this paper, the authors describe the theoretical foundations and engineering approach adopted in the development of an infrared-optical tracking system designed for large-scale immersive Virtual Environments (VE) or Augmented Reality (AR) settings. The system is capable of tracking independent retro-reflective markers arranged in a 3D structure in real time, recovering all possible 6DOF. These artefacts can be attached to the user's stereo glasses to track his or her head while immersed, or used as a 3D input device for rich human-computer interaction (HCI). The hardware configuration consists of 4 shutter-synchronized cameras fitted with band-pass infrared filters and illuminated by infrared array emitters. Pilot lab results have shown a latency of 40 ms when simultaneously tracking the pose of two artefacts with 4 infrared markers, achieving a frame rate of 24.80 fps and showing a mean accuracy of 0.93 mm/0.51° and a mean precision of 0.19 mm/0.04°, respectively, in overall translation/rotation, fulfilling the requirements initially defined.
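The abstract refers to recovering all 6DOF of a rigid marker artefact from its tracked markers. As a rough illustration of that final pose-recovery step only, the sketch below uses the standard SVD-based absolute-orientation method (Kabsch/Horn) to fit a rotation and translation aligning a known artefact model to triangulated marker positions. It is a generic Python/NumPy sketch under assumed marker coordinates; the function name rigid_pose and all numbers are illustrative, not the authors' implementation.

import numpy as np

def rigid_pose(model_pts, observed_pts):
    # Estimate R, t such that observed ~= R @ model + t for matched Nx3 point sets,
    # using the SVD-based absolute-orientation (Kabsch/Horn) method.
    model_pts = np.asarray(model_pts, dtype=float)
    observed_pts = np.asarray(observed_pts, dtype=float)
    mu_m = model_pts.mean(axis=0)          # centroid of the artefact model
    mu_o = observed_pts.mean(axis=0)       # centroid of the triangulated markers
    H = (model_pts - mu_m).T @ (observed_pts - mu_o)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_o - R @ mu_m
    return R, t

# Hypothetical 4-marker artefact (coordinates in mm) and simulated observations.
model = np.array([[0, 0, 0], [80, 0, 0], [0, 60, 0], [0, 0, 40]], dtype=float)
theta = np.deg2rad(30.0)                                # arbitrary test rotation about z
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
true_t = np.array([500.0, 200.0, 1500.0])
observed = model @ true_R.T + true_t

R, t = rigid_pose(model, observed)
print("rotation error:", np.linalg.norm(R - true_R))          # ~0
print("translation error (mm):", np.linalg.norm(t - true_t))  # ~0

In a real pipeline the observed points would come from triangulating the synchronized infrared camera views and matching them to the artefact's known marker geometry before this alignment step.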
Suggested Citation
Filipe Gaspar & Rafael Bastos & Miguel Sales Dias, 2011.
"Accurate Infrared Tracking System for Immersive Virtual Environments,"
International Journal of Creative Interfaces and Computer Graphics (IJCICG), IGI Global, vol. 2(2), pages 49-73, July.
Handle:
RePEc:igg:jcicg0:v:2:y:2011:i:2:p:49-73