Author
Listed:
- Chi-Tung Cheng
(Chang Gung University)
- Yirui Wang
(PAII Inc)
- Huan-Wu Chen
(Chang Gung University College of Medicine)
- Po-Meng Hsiao
(New Taipei Municipal TuCheng Hospital)
- Chun-Nan Yeh
(Chang Gung University)
- Chi-Hsun Hsieh
(Chang Gung University)
- Shun Miao
(PAII Inc)
- Jing Xiao
(PAII Inc)
- Chien-Hung Liao
(Chang Gung University; Center for Artificial Intelligence in Medicine, Chang Gung Memorial Hospital, Linkou)
- Le Lu
(PAII Inc)
Abstract
Pelvic radiographs (PXRs) are essential for detecting proximal femur and pelvis injuries in trauma patients, and are a key component of the trauma survey. None of the currently available algorithms can accurately detect all kinds of trauma-related radiographic findings on PXRs. Here, we show that a universal algorithm can detect most types of trauma-related radiographic findings on PXRs. We develop a multiscale deep learning algorithm called PelviXNet, trained with 5204 PXRs with weakly supervised point annotation. PelviXNet yields an area under the receiver operating characteristic curve (AUROC) of 0.973 (95% CI, 0.960–0.983) and an area under the precision-recall curve (AUPRC) of 0.963 (95% CI, 0.948–0.974) in the clinical population test set of 1888 PXRs. The accuracy, sensitivity, and specificity at the cutoff value are 0.924 (95% CI, 0.912–0.936), 0.908 (95% CI, 0.885–0.908), and 0.932 (95% CI, 0.919–0.946), respectively. PelviXNet demonstrates performance comparable to that of radiologists and orthopedic surgeons in detecting pelvic and hip fractures.
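To illustrate how the metrics reported in the abstract are defined, here is a minimal pure-Python sketch of AUROC (via pairwise comparison of positive and negative scores) and of sensitivity/specificity at a cutoff. The labels and scores below are made-up toy values, not data from the paper.

```python
def auroc(labels, scores):
    """AUROC as the probability that a random positive outranks a random negative.

    Equivalent to the Mann-Whitney U formulation; ties count as half a win.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


def sensitivity_specificity(labels, scores, cutoff):
    """Sensitivity (TP rate) and specificity (TN rate) at a given score cutoff."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= cutoff)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < cutoff)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < cutoff)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)


# Toy example: two fracture cases, two normal cases.
labels = [1, 1, 0, 0]
scores = [0.9, 0.8, 0.3, 0.4]
print(auroc(labels, scores))                          # perfect ranking -> 1.0
print(sensitivity_specificity(labels, scores, 0.5))   # -> (1.0, 1.0)
```

The paper's operating point (sensitivity 0.908, specificity 0.932) corresponds to choosing one such cutoff on the model's output scores; AUROC summarizes performance across all possible cutoffs.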
Suggested Citation
Chi-Tung Cheng & Yirui Wang & Huan-Wu Chen & Po-Meng Hsiao & Chun-Nan Yeh & Chi-Hsun Hsieh & Shun Miao & Jing Xiao & Chien-Hung Liao & Le Lu, 2021.
"A scalable physician-level deep learning algorithm detects universal trauma on pelvic radiographs,"
Nature Communications, Nature, vol. 12(1), pages 1-10, December.
Handle:
RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-21311-3
DOI: 10.1038/s41467-021-21311-3
Download full text from publisher
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-21311-3. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .
Please note that corrections may take a couple of weeks to filter through
the various RePEc services.