Author
Listed:
- Qiang Fu
- Xingui Zhang
- Jinxiu Xu
- Haimin Zhang
Abstract
Motion pose capture technology can effectively solve the difficulty of defining character motion in 3D animation production and greatly reduce the workload of character motion control, thereby improving both the efficiency of animation development and the fidelity of character motion. Motion capture is also widely used in virtual reality systems, virtual training grounds, and real-time tracking of the trajectories of general objects. This paper proposes a pose estimation algorithm suited to embedded platforms. The conventional centralized Kalman filter is divided into a two-step Kalman filter: the sensors are processed separately according to their different characteristics, isolating the cross-influence between them. An adaptive adjustment method based on fuzzy logic is proposed: the acceleration, the angular velocity, and the strength of the ambient geomagnetic field serve as fuzzy-logic inputs to judge the motion state of the carrier, and the covariance matrix of the filter is adjusted accordingly. Adaptive adjustment of the sensors is thus converted into recognition of the motion state. To study human motion pose capture, this paper designs a verification experiment based on an existing laboratory robotic arm; the experiment shows that the proposed motion pose capture method performs well. A human-body motion capture experiment was also carried out, and the results show that the captured pose-angle information restores human motion well. A visual model of human motion pose capture was established; comparison with the real motion shows that the simulation reproduces the human motion process faithfully. For human motion recognition, this paper designs a binary classification model and runs experiments on daily human behaviors.
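The fuzzy-adaptive filtering idea above can be illustrated with a minimal sketch. This is not the authors' implementation: the membership function, the thresholds `tol_a`/`tol_w`, and the inflation factor are all hypothetical, and the update is a generic single-stage Kalman measurement step. It shows only the core mechanism — a fuzzy "motion state" score computed from accelerometer and gyroscope readings scales the measurement noise covariance, so the accelerometer is trusted less as a gravity reference while the carrier is moving.

```python
import numpy as np

GRAVITY = 9.81  # nominal gravity magnitude, m/s^2

def motion_membership(accel, gyro, tol_a=0.5, tol_w=0.1):
    """Crude fuzzy membership in [0, 1] for 'the carrier is moving'.

    0 -> static (accelerometer reads pure gravity, gyro near zero),
    1 -> strongly dynamic. tol_a and tol_w are illustrative thresholds.
    """
    da = abs(np.linalg.norm(accel) - GRAVITY) / tol_a  # excess acceleration
    dw = np.linalg.norm(gyro) / tol_w                  # rotation rate
    return float(np.clip(max(da, dw), 0.0, 1.0))

def adaptive_kalman_update(x, P, z, H, R0, mu, inflate=100.0):
    """One Kalman measurement update with fuzzy-inflated noise covariance.

    As mu -> 1 (dynamic motion) the accelerometer no longer measures
    gravity alone, so R is inflated and the correction is down-weighted.
    """
    R = R0 * (1.0 + inflate * mu)          # adaptive covariance adjustment
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)                # state correction
    P = (np.eye(len(x)) - K @ H) @ P       # covariance update
    return x, P
```

In a two-step arrangement, the gyroscope would drive the prediction stage while accelerometer and magnetometer corrections each pass through an update of this form with their own fuzzy-adjusted covariances.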
Experiments show that binary-classification capture and recognition of human motion poses achieves good accuracy. SVC performs particularly well on the binary recognition task: with all optimization algorithms applied, the accuracy exceeds 90%, and the final recognition accuracy also exceeds 90%. The time required for human motion pose capture and recognition is under 2 s.
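For context, a binary SVC activity classifier of the kind evaluated here typically works on statistical features extracted from fixed-length sensor windows. The sketch below is an assumption-laden stand-in, not the paper's pipeline: the features (per-axis mean and standard deviation), window length, and synthetic "static vs. dynamic" data are all illustrative, with real inputs coming from the capture hardware.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def window_features(window):
    """Per-window statistics often used for activity recognition:
    mean and standard deviation of each sensor axis."""
    return np.hstack([window.mean(axis=0), window.std(axis=0)])

def make_windows(n, scale):
    """Synthetic 50-sample, 3-axis windows; larger scale mimics
    the higher signal variance of dynamic motion."""
    return [rng.normal(0.0, scale, size=(50, 3)) for _ in range(n)]

# Class 0 = near-static behavior, class 1 = dynamic behavior.
windows = make_windows(100, 0.1) + make_windows(100, 1.0)
X = np.array([window_features(w) for w in windows])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

On such cleanly separated synthetic classes the classifier scores near-perfectly; the paper's reported >90% figures refer to real captured motion data, which is far noisier.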
Suggested Citation
Qiang Fu & Xingui Zhang & Jinxiu Xu & Haimin Zhang, 2020.
"Capture of 3D Human Motion Pose in Virtual Reality Based on Video Recognition,"
Complexity, Hindawi, vol. 2020, pages 1-17, November.
Handle:
RePEc:hin:complx:8857748
DOI: 10.1155/2020/8857748