Author
Listed:
- Junzhao Du
- Wenjing Chen
- Yuewei Liu
- Yawen Gu
- Hui Liu
Abstract
Localization, together with scene and human activity sensing, provides primitive and essential information for upper-layer mobile applications. In this paper, we present a novel indoor localization system. We not only perform coarse localization using Wi-Fi/GSM signals, but also use the smartphone's microphone to sense the environment in depth. By analyzing ambient sound and speech after voice activity detection, we can determine whether the user is at a given place, doing a given thing, at the time prescribed by his or her schedule. Ambient noise is used to identify the surrounding scene, and the user's current activity is inferred from his or her speech. Finally, accurate localization and user status information are obtained by fusing the above sound-sensing results. According to the schedule prestored on the backend server, the location and sensing information are returned to a monitor who wants to know where the user is and what he or she is doing. We prototype the system on Android phones and evaluate it comprehensively with data collected from 61 indoor sites by 100 volunteers over two months of experiments using different phone models. We believe this is a novel approach to indoor localization, one that holds promise for real-world deployment.
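To make the two audio stages of the abstract concrete, the following Python sketch illustrates an energy-based voice activity detector that splits speech frames from ambient-noise frames, and an ambient-sound signature (coarse spectral band energies) matched to prestored sites by nearest neighbor. The paper does not publish code; every function name, threshold, and band layout here is an illustrative assumption, not the authors' implementation.

# Minimal sketch of the pipeline in the abstract, assuming:
# (1) a crude energy-based VAD separates speech from ambient noise, and
# (2) an ambient-sound "signature" of coarse spectral band energies is
#     matched against prestored per-site signatures by nearest neighbor.
# All names and thresholds are hypothetical; not the authors' code.
import numpy as np

FRAME = 1024     # samples per analysis frame (assumed)
N_BANDS = 16     # coarse spectral bands per signature (assumed)

def frame_energies(samples: np.ndarray) -> np.ndarray:
    """Short-time energy per frame, the basis of the VAD decision."""
    n = len(samples) // FRAME
    frames = samples[: n * FRAME].reshape(n, FRAME)
    return (frames ** 2).mean(axis=1)

def split_speech_ambient(samples: np.ndarray, ratio: float = 4.0):
    """Mark a frame as speech if its energy exceeds `ratio` times the
    median frame energy (a crude stand-in for a real VAD)."""
    e = frame_energies(samples)
    n = len(e)
    frames = samples[: n * FRAME].reshape(n, FRAME)
    speech_mask = e > ratio * np.median(e)
    return frames[speech_mask], frames[~speech_mask]

def ambient_signature(ambient_frames: np.ndarray) -> np.ndarray:
    """Average log energy in N_BANDS spectral bands, unit-normalized so
    recordings made at different gains remain comparable."""
    spec = np.abs(np.fft.rfft(ambient_frames, axis=1))
    bands = np.array_split(spec, N_BANDS, axis=1)
    sig = np.log1p(np.array([b.mean() for b in bands]))
    return sig / np.linalg.norm(sig)

def match_site(sig: np.ndarray, site_sigs: dict) -> str:
    """Nearest-neighbor match against prestored per-site signatures."""
    return min(site_sigs, key=lambda s: np.linalg.norm(sig - site_sigs[s]))

# Usage with synthetic audio standing in for a microphone capture:
rng = np.random.default_rng(0)
audio = rng.normal(0, 0.01, 16000 * 5)            # 5 s of quiet ambience
audio[16000:32000] += rng.normal(0, 0.2, 16000)   # 1 s louder "speech" burst
speech, ambient = split_speech_ambient(audio)
sig = ambient_signature(ambient)
sites = {"cafeteria": ambient_signature(rng.normal(0, 0.01, (40, FRAME))),
         "corridor": ambient_signature(rng.normal(0, 0.05, (40, FRAME)))}
print("matched site:", match_site(sig, sites))

In the real system the speech frames would feed an activity classifier and the site match would be fused with the Wi-Fi/GSM estimate and the prestored schedule; the sketch covers only the sound-signature step.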
Suggested Citation
Junzhao Du & Wenjing Chen & Yuewei Liu & Yawen Gu & Hui Liu, 2013.
"Catch You as I Can: Indoor Localization via Ambient Sound Signature and Human Behavior,"
International Journal of Distributed Sensor Networks, vol. 9(11), Article ID 434301, November.
Handle:
RePEc:sae:intdis:v:9:y:2013:i:11:p:434301
DOI: 10.1155/2013/434301
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:intdis:v:9:y:2013:i:11:p:434301. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.