Author
Listed:
- Javad Noorbakhsh
(The Jackson Laboratory for Genomic Medicine)
- Saman Farahmand
(Computational Sciences PhD Program, University of Massachusetts-Boston)
- Ali Foroughi pour
(The Jackson Laboratory for Genomic Medicine)
- Sandeep Namburi
(The Jackson Laboratory for Genomic Medicine)
- Dennis Caruana
(Department of Pathology, Yale University School of Medicine)
- David Rimm
(Department of Pathology, Yale University School of Medicine)
- Mohammad Soltanieh-ha
(Department of Information Systems, Boston University)
- Kourosh Zarringhalam
(Computational Sciences PhD Program, University of Massachusetts-Boston
Department of Mathematics, University of Massachusetts-Boston)
- Jeffrey H. Chuang
(The Jackson Laboratory for Genomic Medicine
UCONN Health, Department of Genetics and Genome Sciences)
Abstract
Histopathological images are a rich but incompletely explored data type for studying cancer. Manual inspection is time-consuming, making it challenging to use for image data mining. Here we show that convolutional neural networks (CNNs) can be systematically applied across cancer types, enabling comparisons to reveal shared spatial behaviors. We develop CNN architectures to analyze 27,815 hematoxylin and eosin (H&E) scanned images from The Cancer Genome Atlas (TCGA) for tumor/normal, cancer subtype, and mutation classification. Our CNNs are able to classify TCGA pathologist-annotated tumor/normal status of whole slide images (WSIs) in 19 cancer types with consistently high AUCs (0.995 ± 0.008), as well as subtypes with lower but significant accuracy (AUC 0.87 ± 0.1). Remarkably, tumor/normal CNNs trained on one tissue are effective in others (AUC 0.88 ± 0.11), with classifier relationships also recapitulating known adenocarcinoma, carcinoma, and developmental biology. Moreover, classifier comparisons reveal intra-slide spatial similarities, with an average tile-level correlation of 0.45 ± 0.16 between classifier pairs. Breast cancers, bladder cancers, and uterine cancers have spatial patterns that are particularly easy to detect, suggesting these cancers can be canonical types for image analysis. Patterns for TP53 mutations can also be detected, with WSI self- and cross-tissue AUCs ranging from 0.65 to 0.80. Finally, we comparatively evaluate CNNs on 170 breast and colon cancer images with pathologist-annotated nuclei, finding that both cellular and intercellular regions contribute to CNN accuracy. These results demonstrate the power of CNNs not only for histopathological classification, but also for cross-comparisons to reveal conserved spatial behaviors across tumors.
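As a concrete illustration of the slide-level evaluation summarized above, the short Python sketch below shows one plausible way to aggregate per-tile CNN tumor probabilities into a slide-level score and compute a tumor/normal AUC. This is a minimal sketch for illustration, not the authors' released pipeline; the mean-probability aggregation rule, the synthetic tile probabilities, and the function names are assumptions.

# Minimal sketch (assumed, not the authors' code): aggregate per-tile CNN
# tumor probabilities into a slide-level score and compute tumor/normal AUC.
import numpy as np
from sklearn.metrics import roc_auc_score

def slide_score(tile_probabilities):
    """Average per-tile tumor probabilities into one slide-level score
    (an assumed aggregation rule for illustration)."""
    return float(np.mean(tile_probabilities))

# Synthetic example: per-tile CNN output probabilities for three WSIs.
tile_probs_per_slide = [
    np.array([0.90, 0.80, 0.95]),  # mostly tumor-like tiles
    np.array([0.10, 0.20, 0.05]),  # mostly normal-like tiles
    np.array([0.60, 0.40, 0.70]),  # mixed slide
]
slide_labels = np.array([1, 0, 1])  # pathologist-annotated tumor/normal status

slide_scores = np.array([slide_score(p) for p in tile_probs_per_slide])
print("Slide-level tumor/normal AUC:", roc_auc_score(slide_labels, slide_scores))

The same per-tile probabilities from two different classifiers can also be compared across tiles of a slide (for example, with a Pearson correlation) to obtain the kind of intra-slide spatial similarity reported in the abstract.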
Suggested Citation
Javad Noorbakhsh & Saman Farahmand & Ali Foroughi pour & Sandeep Namburi & Dennis Caruana & David Rimm & Mohammad Soltanieh-ha & Kourosh Zarringhalam & Jeffrey H. Chuang, 2020.
"Deep learning-based cross-classifications reveal conserved spatial behaviors within tumor histological images,"
Nature Communications, Nature, vol. 11(1), pages 1-14, December.
Handle:
RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-20030-5
DOI: 10.1038/s41467-020-20030-5