Author
Listed:
- Cheng Jin
(Stanford University School of Medicine)
- Heng Yu
(Stanford University School of Medicine)
- Jia Ke
(Sun Yat-sen University
Guangdong Institute of Gastroenterology, Guangdong Provincial Key Laboratory of Colorectal and Pelvic Floor Diseases)
- Peirong Ding
(Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine)
- Yongju Yi
(Sun Yat-sen University)
- Xiaofeng Jiang
(Sun Yat-sen University
Guangdong Institute of Gastroenterology, Guangdong Provincial Key Laboratory of Colorectal and Pelvic Floor Diseases)
- Xin Duan
(Sun Yat-sen University
Guangdong Institute of Gastroenterology, Guangdong Provincial Key Laboratory of Colorectal and Pelvic Floor Diseases)
- Jinghua Tang
(Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine)
- Daniel T. Chang
(Stanford University School of Medicine)
- Xiaojian Wu
(Sun Yat-sen University
Guangdong Institute of Gastroenterology, Guangdong Provincial Key Laboratory of Colorectal and Pelvic Floor Diseases)
- Feng Gao
(Sun Yat-sen University
Guangdong Institute of Gastroenterology, Guangdong Provincial Key Laboratory of Colorectal and Pelvic Floor Diseases)
- Ruijiang Li
(Stanford University School of Medicine)
Abstract
Radiographic imaging is routinely used to evaluate treatment response in solid tumors. Current imaging response metrics do not reliably predict the underlying biological response. Here, we present a multi-task deep learning approach that allows simultaneous tumor segmentation and response prediction. We design two Siamese subnetworks that are joined at multiple layers, which enables integration of multi-scale feature representations and in-depth comparison of pre-treatment and post-treatment images. The network is trained using 2568 magnetic resonance imaging scans of 321 rectal cancer patients for predicting pathologic complete response after neoadjuvant chemoradiotherapy. In multi-institution validation, the imaging-based model achieves AUC of 0.95 (95% confidence interval: 0.91–0.98) and 0.92 (0.87–0.96) in two independent cohorts of 160 and 141 patients, respectively. When combined with blood-based tumor markers, the integrated model further improves prediction accuracy with AUC 0.97 (0.93–0.99). Our approach to capturing dynamic information in longitudinal images may be broadly used for screening, treatment response evaluation, disease monitoring, and surveillance.
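The architecture sketched in the abstract (two weight-sharing subnetworks compared at multiple scales, with a segmentation head and a response-prediction head) can be illustrated with a small, hypothetical PyTorch example. The class name SiameseMultiTaskNet, the layer widths, and the pooled absolute-difference fusion below are illustrative assumptions, not the authors' released implementation; the sketch only shows how a shared encoder, multi-scale pre/post comparison, and two task heads could fit together.

# A minimal, hypothetical sketch (not the authors' code) of a multi-task
# Siamese network: a weight-sharing encoder processes pre- and post-treatment
# images, feature maps are compared at several scales, and two heads produce
# (i) a segmentation map and (ii) a treatment-response logit.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class SiameseMultiTaskNet(nn.Module):
    def __init__(self, in_ch=1, widths=(16, 32, 64)):
        super().__init__()
        # Shared (Siamese) encoder applied to both time points.
        self.enc = nn.ModuleList()
        ch = in_ch
        for w in widths:
            self.enc.append(conv_block(ch, w))
            ch = w
        self.pool = nn.MaxPool2d(2)

        # Segmentation head: U-Net-style decoder over the post-treatment features.
        rev = list(reversed(widths))
        self.up = nn.ModuleList()
        self.dec = nn.ModuleList()
        for i in range(len(rev) - 1):
            self.up.append(nn.ConvTranspose2d(rev[i], rev[i + 1], 2, stride=2))
            self.dec.append(conv_block(rev[i + 1] * 2, rev[i + 1]))
        self.seg_out = nn.Conv2d(widths[0], 1, 1)

        # Response head: compare pre/post features at every scale
        # (absolute difference after global pooling), then classify.
        self.cls = nn.Sequential(
            nn.Linear(sum(widths), 64), nn.ReLU(inplace=True),
            nn.Linear(64, 1),
        )

    def encode(self, x):
        feats = []
        for i, block in enumerate(self.enc):
            x = block(x)
            feats.append(x)
            if i < len(self.enc) - 1:
                x = self.pool(x)
        return feats  # one feature map per scale

    def forward(self, pre, post):
        f_pre, f_post = self.encode(pre), self.encode(post)

        # Multi-scale comparison: pooled |pre - post| at each depth.
        diffs = [torch.abs(a.mean(dim=(2, 3)) - b.mean(dim=(2, 3)))
                 for a, b in zip(f_pre, f_post)]
        response_logit = self.cls(torch.cat(diffs, dim=1))

        # Decode the post-treatment branch with skip connections.
        x = f_post[-1]
        for up, dec, skip in zip(self.up, self.dec, reversed(f_post[:-1])):
            x = up(x)
            x = dec(torch.cat([x, skip], dim=1))
        seg_logits = self.seg_out(x)
        return seg_logits, response_logit


if __name__ == "__main__":
    net = SiameseMultiTaskNet()
    pre = torch.randn(2, 1, 64, 64)   # toy pre-treatment slices
    post = torch.randn(2, 1, 64, 64)  # toy post-treatment slices
    seg, resp = net(pre, post)
    print(seg.shape, resp.shape)      # (2, 1, 64, 64) and (2, 1)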
Suggested Citation
Cheng Jin & Heng Yu & Jia Ke & Peirong Ding & Yongju Yi & Xiaofeng Jiang & Xin Duan & Jinghua Tang & Daniel T. Chang & Xiaojian Wu & Feng Gao & Ruijiang Li, 2021.
"Predicting treatment response from longitudinal images using multi-task deep learning,"
Nature Communications, Nature, vol. 12(1), pages 1-11, December.
Handle:
RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-22188-y
DOI: 10.1038/s41467-021-22188-y
Citations
Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.
Cited by:
- Seungmin Lee & Jeong Soo Park & Hyowon Woo & Yong Kyoung Yoo & Dongho Lee & Seok Chung & Dae Sung Yoon & Ki- Baek Lee & Jeong Hoon Lee, 2024.
"Rapid deep learning-assisted predictive diagnostics for point-of-care testing,"
Nature Communications, Nature, vol. 15(1), pages 1-12, December.
- Yifan Zhong & Chuang Cai & Tao Chen & Hao Gui & Jiajun Deng & Minglei Yang & Bentong Yu & Yongxiang Song & Tingting Wang & Xiwen Sun & Jingyun Shi & Yangchun Chen & Dong Xie & Chang Chen & Yunlang She, 2023.
"PET/CT based cross-modal deep learning signature to predict occult nodal metastasis in lung cancer,"
Nature Communications, Nature, vol. 14(1), pages 1-14, December.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-22188-y. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.