Authors:
- Y. J. Qi
- Y. P. Kong
- Q. Zhang
- Long Wang
Abstract
Gait recognition is a powerful tool for long-distance identification. However, gait is influenced by the walking environment and by appearance changes, so the recognition rate declines sharply when the viewing angle changes. In this work, we propose a novel cross-view gait recognition method based on two-way similarity learning. Focusing on the relationships between gait elements in three-dimensional space and on the wholeness of human body movement, we design a three-dimensional gait constraint model, built on joint motion constraint relationships, that is robust to view changes. Unlike the classic three-dimensional model, the proposed model characterizes motion constraints and action constraints between joints along both the temporal and spatial dimensions. We then propose an end-to-end two-way gait network that uses long short-term memory (LSTM) and residual network 50 (ResNet-50) to extract the temporal and spatial difference features, respectively, of model pairs. The two types of difference features are merged at a high level in the network, and similarity values are obtained through a softmax layer. Our method is evaluated on the challenging CASIA-B data set for cross-view gait recognition. The experimental results show that the method achieves a higher recognition rate than previously developed model-based methods, reaching 72.8% when the viewing angle changes from 36° to 144° for normal walking. Finally, the new method also performs better in cases with large cross-view angles, illustrating that our model is robust to viewing-angle changes and that the proposed network offers considerable potential in practical application scenarios.
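The two-way design described in the abstract (two branches extracting temporal and spatial difference features, fused at a high level and passed through a softmax layer to produce a similarity score) can be illustrated with a toy NumPy sketch. This is not the authors' implementation: the `temporal_features` and `spatial_features` functions below are trivial placeholders standing in for the paper's LSTM and ResNet-50 branches, and all shapes, names, and weights are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def temporal_features(seq):
    # Placeholder for the LSTM branch: seq is (T, J) joint values over
    # T frames; summarize per-joint motion by averaging frame deltas.
    return np.diff(seq, axis=0).mean(axis=0)

def spatial_features(frame):
    # Placeholder for the ResNet-50 branch: frame is (J,) joint values
    # of a single pose; center them as a crude spatial descriptor.
    return frame - frame.mean()

def two_way_similarity(seq_a, seq_b, W, b):
    # Per-branch difference features for the pair, fused at a high
    # level, then a linear layer + softmax as the similarity head.
    d_t = np.abs(temporal_features(seq_a) - temporal_features(seq_b))
    d_s = np.abs(spatial_features(seq_a[0]) - spatial_features(seq_b[0]))
    fused = np.concatenate([d_t, d_s])
    return softmax(W @ fused + b)   # e.g. [p_different, p_same]

rng = np.random.default_rng(0)
T, J = 10, 4                        # toy sizes: frames, joints
seq_a = rng.normal(size=(T, J))
seq_b = seq_a + 0.01 * rng.normal(size=(T, J))
W = rng.normal(size=(2, 2 * J))    # untrained, illustrative weights
b = np.zeros(2)
probs = two_way_similarity(seq_a, seq_b, W, b)
print(probs)
```

In the paper the two branches are trained end to end, so the fusion weights are learned rather than random; the sketch only shows the data flow of pairing, per-branch differencing, fusion, and softmax.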
Suggested Citation
Y. J. Qi & Y. P. Kong & Q. Zhang & Long Wang, 2022.
"A Cross-View Gait Recognition Method Using Two-Way Similarity Learning,"
Mathematical Problems in Engineering, Hindawi, vol. 2022, pages 1-14, May.
Handle: RePEc:hin:jnlmpe:2674425
DOI: 10.1155/2022/2674425