Author
Listed:
- Paulo Armando Cavalcante Aguilar
(Electronic and Physics Department, Mines-Télécom, Télécom SudParis, Evry, France)
- Jerome Boudy
(Electronic and Physics Department, Mines-Télécom, Télécom SudParis, Evry, France)
- Dan Istrate
(École Supérieure d’Ingénieurs en Informatique et Génie des Télécommunications, Villejuif, France)
- Hamid Medjahed
(École Supérieure d’Ingénieurs en Informatique et Génie des Télécommunications, Villejuif, France)
- Bernadette Dorizzi
(Electronic and Physics Department, Mines-Télécom, Télécom SudParis, Evry, France)
- João Cesar Moura Mota
(Federal University of Ceará, Benfica, Fortaleza, Brazil)
- Jean Louis Baldinger
(Electronic and Physics Department, Mines-Télécom, Télécom SudParis, Evry, France)
- Toufik Guettari
(Electronic and Physics Department, Mines-Télécom, Télécom SudParis, Evry, France)
- Imad Belfeki
(Electronic and Physics Department, Mines-Télécom, Télécom SudParis, Evry, France)
Abstract
Multi-sensor fusion can provide more accurate and reliable information than each sensor taken separately. Moreover, the data from the multiple heterogeneous sensors present in medical surveillance systems carry different degrees of uncertainty. Among multi-sensor data fusion techniques, Bayesian methods and evidence theories such as Dempster-Shafer Theory (DST) are commonly used to handle this uncertainty in the fusion process. Based on a graphical representation of the DST called Evidential Networks, we propose a structure for heterogeneous multi-sensor fusion for fall detection. The proposed Evidential Network (EN) can handle the uncertainty present in mobile and fixed sensor-based remote monitoring systems (fall detection) by fusing them, thereby increasing fall detection sensitivity compared to either system taken alone.
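The core fusion step the abstract refers to is Dempster's rule of combination, which merges two sensors' belief-mass assignments while redistributing conflicting mass. The sketch below is a minimal, generic illustration of that rule; the sensor names and mass values are hypothetical and are not taken from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function is a dict mapping frozenset focal elements
    to belief mass (masses sum to 1)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # non-empty intersection: compatible evidence
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:      # empty intersection: conflicting evidence
            conflict += wa * wb
    # Normalise by the non-conflicting mass (1 - K)
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

FALL, NO_FALL = frozenset({"fall"}), frozenset({"no_fall"})
THETA = FALL | NO_FALL  # the whole frame: total ignorance

# Hypothetical readings from a wearable sensor and an ambient one
m_wearable = {FALL: 0.6, NO_FALL: 0.1, THETA: 0.3}
m_ambient  = {FALL: 0.5, NO_FALL: 0.2, THETA: 0.3}

m_fused = combine(m_wearable, m_ambient)
print(m_fused[FALL])  # fused belief in "fall" exceeds either sensor alone
```

Because both sources lean toward "fall", the combined mass on that hypothesis grows while the mass left on total ignorance shrinks, which is the mechanism behind the sensitivity gain the abstract claims for the fused system.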
Suggested Citation
Paulo Armando Cavalcante Aguilar & Jerome Boudy & Dan Istrate & Hamid Medjahed & Bernadette Dorizzi & João Cesar Moura Mota & Jean Louis Baldinger & Toufik Guettari & Imad Belfeki, 2013.
"Evidential Network-Based Multimodal Fusion for Fall Detection,"
International Journal of E-Health and Medical Communications (IJEHMC), IGI Global, vol. 4(1), pages 46-60, January.
Handle:
RePEc:igg:jehmc0:v:4:y:2013:i:1:p:46-60