Author
Listed:
- Xuan Xiao
(Tsinghua University)
- Jianjian Wang
(Tsinghua University)
- Pingfa Feng
(Tsinghua University)
- Ao Gong
(Tsinghua University)
- Xiangyu Zhang
(Tsinghua University)
- Jianfu Zhang
(Tsinghua University)
Abstract
Inertial measurement unit (IMU)-based methods have great potential for capturing motion in large-scale, complex environments with many people. Methods based on sparse IMUs are particularly attractive because of their simplicity and flexibility. However, improving computational efficiency and reducing latency in such methods remains challenging. In this paper, we propose Fast Inertial Poser, a deep neural network that estimates full-body motion from six IMUs while taking body shape parameters into account. We design a network architecture based on recurrent neural networks organized according to the kinematic tree. The method incorporates human body shape information and, by relying only on causal observations, eliminates any dependence on future frames. Joint positions for the upper body and lower body are estimated independently by separate network modules, and joint rotations are then obtained through a carefully designed single-frame inverse kinematics solver. Experiments show that the method greatly improves inference speed and reduces latency while preserving reconstruction accuracy compared with previous methods. Fast Inertial Poser runs at 65 fps with 15 ms latency on an embedded computer, demonstrating the efficiency of the model.
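The architecture described in the abstract can be illustrated with a minimal sketch. The Python/PyTorch code below is not the authors' implementation; the module names, layer sizes, per-IMU feature dimension, and the upper/lower joint split are assumptions chosen only to show how two causal recurrent branches, conditioned on body-shape parameters, could regress joint positions frame by frame without using future frames.

# Minimal sketch (not the released code) of the two-branch causal RNN idea:
# per-frame readings from 6 IMUs plus body-shape parameters feed separate
# upper-body and lower-body recurrent modules that regress joint positions.
# All dimensions and names below are illustrative assumptions.
import torch
import torch.nn as nn

class BranchRNN(nn.Module):
    """One causal RNN branch (upper or lower body) predicting joint positions."""
    def __init__(self, in_dim, hidden_dim, n_joints):
        super().__init__()
        self.rnn = nn.LSTM(in_dim, hidden_dim, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_joints * 3)  # 3-D position per joint

    def forward(self, x, state=None):
        h, state = self.rnn(x, state)  # unidirectional: uses past frames only
        return self.head(h), state

class FastPoserSketch(nn.Module):
    """Upper/lower-body branches conditioned on body-shape parameters."""
    def __init__(self, imu_dim=6 * 12, shape_dim=10, hidden_dim=256):
        super().__init__()
        # Assumed: 12 features per IMU (orientation + acceleration), 10 shape betas.
        in_dim = imu_dim + shape_dim
        self.upper = BranchRNN(in_dim, hidden_dim, n_joints=12)  # assumed split
        self.lower = BranchRNN(in_dim, hidden_dim, n_joints=10)

    def forward(self, imu_seq, betas, states=(None, None)):
        # imu_seq: (B, T, imu_dim); betas: (B, shape_dim), broadcast over time.
        shape = betas.unsqueeze(1).expand(-1, imu_seq.shape[1], -1)
        x = torch.cat([imu_seq, shape], dim=-1)
        up, s_up = self.upper(x, states[0])
        low, s_low = self.lower(x, states[1])
        return torch.cat([up, low], dim=-1), (s_up, s_low)

# Streaming usage: process one frame at a time, carrying the RNN state forward,
# which is what keeps the latency to a single frame.
model = FastPoserSketch()
betas = torch.zeros(1, 10)
state = (None, None)
frame = torch.randn(1, 1, 6 * 12)  # one incoming IMU frame
positions, state = model(frame, betas, state)

A single-frame inverse kinematics step (not shown) would then convert the predicted joint positions into joint rotations, as described in the abstract.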
Suggested Citation
Xuan Xiao & Jianjian Wang & Pingfa Feng & Ao Gong & Xiangyu Zhang & Jianfu Zhang, 2024.
"Fast Human Motion reconstruction from sparse inertial measurement units considering the human shape,"
Nature Communications, Nature, vol. 15(1), pages 1-11, December.
Handle:
RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-46662-5
DOI: 10.1038/s41467-024-46662-5
Download full text from publisher
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-46662-5. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .
Please note that corrections may take a couple of weeks to filter through
the various RePEc services.