Authors
Listed:
- Farah Hazem
- Bennour Akram
- Tahar Mekhaznia
- Fahad Ghabban
- Abdullah Alsaeedi
- Bhawna Goyal
Abstract
Person identification through chest X-ray radiographs is an emerging approach in both healthcare and biometric security. In contrast to traditional biometric modalities such as facial recognition, fingerprints and iris scans, research interest in chest X-ray recognition has been spurred by its high recognition rates. Because chest X-ray images capture the intricate anatomical features of an individual's rib cage, lungs and heart, they remain usable for identification even when the body is severely damaged. Deep learning has shown promising results in classification and image-similarity tasks; however, training convolutional neural networks (CNNs) requires large amounts of labelled data and is time-consuming. In this study, we draw on the NIH ChestX-ray14 dataset, comprising 112,120 frontal-view chest radiographs from 30,805 unique patients. Our method combines Siamese neural networks and the triplet loss with fine-tuned CNN models for feature extraction: the Siamese networks enable robust image-similarity comparison, while the triplet loss optimizes the embedding space by reducing intra-class variation and increasing inter-class distance. Experimental results show that the VGG-19 model achieves the best performance, with 97% accuracy, a well-balanced precision of 95.3% and a recall of 98.4%, surpassing the other CNN models evaluated in our study as well as existing state-of-the-art approaches to person identification from chest X-ray images.
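The triplet loss described in the abstract penalizes an anchor embedding that lies closer to a different-person (negative) embedding than to a same-person (positive) one by less than a margin. A minimal sketch of that objective on toy embedding vectors follows; the embeddings, dimensionality and margin value are illustrative assumptions, not the authors' implementation, which operates on CNN features of chest radiographs.

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss: max(d(a, p) - d(a, n) + margin, 0).

    The loss is zero once the negative is at least `margin` farther
    from the anchor than the positive, which is how the loss shrinks
    intra-class distances and grows inter-class ones.
    """
    d_pos = euclidean(anchor, positive)  # same-person distance
    d_neg = euclidean(anchor, negative)  # different-person distance
    return max(d_pos - d_neg + margin, 0.0)

# Toy 3-d embeddings: anchor is near the positive, far from the negative,
# so the margin is already satisfied and the loss is zero.
a = [0.0, 0.0, 1.0]
p = [0.0, 0.1, 1.0]
n = [1.0, 0.0, 0.0]
print(triplet_loss(a, p, n))  # 0.0
```

In practice the three embeddings would come from the same shared-weight CNN branch of the Siamese network, and the loss would be averaged over mined triplets in each training batch.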
Suggested Citation
Farah Hazem & Bennour Akram & Tahar Mekhaznia & Fahad Ghabban & Abdullah Alsaeedi & Bhawna Goyal, 2024.
"Beyond Traditional Biometrics: Harnessing Chest X-Ray Features for Robust Person Identification,"
Acta Informatica Pragensia, Prague University of Economics and Business, vol. 2024(2), pages 234-250.
Handle:
RePEc:prg:jnlaip:v:2024:y:2024:i:2:id:238:p:234-250
DOI: 10.18267/j.aip.238