Author
Listed:
- Oliver Schoppe
(Technical University of Munich
Helmholtz Zentrum München)
- Chenchen Pan
(Helmholtz Zentrum München
University Hospital)
- Javier Coronel
(Technical University of Munich)
- Hongcheng Mai
(Helmholtz Zentrum München
University Hospital)
- Zhouyi Rong
(Helmholtz Zentrum München
University Hospital)
- Mihail Ivilinov Todorov
(Helmholtz Zentrum München
University Hospital
Graduate School of Systemic Neurosciences (GSN))
- Annemarie Müskes
(Charité, Universitätsmedizin Berlin)
- Fernando Navarro
(Technical University of Munich)
- Hongwei Li
(Technical University of Munich)
- Ali Ertürk
(Helmholtz Zentrum München
University Hospital
Munich Cluster for Systems Neurology (SyNergy))
- Bjoern H. Menze
(Technical University of Munich
University of Zurich)
Abstract
Whole-body imaging of mice is a key source of information for research. Organ segmentation is a prerequisite for quantitative analysis but is a tedious and error-prone task if done manually. Here, we present a deep learning solution called AIMOS that automatically segments major organs (brain, lungs, heart, liver, kidneys, spleen, bladder, stomach, intestine) and the skeleton in less than a second, orders of magnitude faster than prior algorithms. AIMOS matches or exceeds the segmentation quality of state-of-the-art approaches and of human experts. We exemplify direct applicability for biomedical research for localizing cancer metastases. Furthermore, we show that expert annotations are subject to human error and bias. As a consequence, we show that at least two independently created annotations are needed to assess model performance. Importantly, AIMOS addresses the issue of human bias by identifying the regions where humans are most likely to disagree, and thereby localizes and quantifies this uncertainty for improved downstream analysis. In summary, AIMOS is a powerful open-source tool to increase scalability, reduce bias, and foster reproducibility in many areas of biomedical research.
Suggested Citation
Oliver Schoppe & Chenchen Pan & Javier Coronel & Hongcheng Mai & Zhouyi Rong & Mihail Ivilinov Todorov & Annemarie Müskes & Fernando Navarro & Hongwei Li & Ali Ertürk & Bjoern H. Menze, 2020.
"Deep learning-enabled multi-organ segmentation in whole-body mouse scans,"
Nature Communications, Nature, vol. 11(1), pages 1-14, December.
Handle:
RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-19449-7
DOI: 10.1038/s41467-020-19449-7
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-19449-7. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.
Please note that corrections may take a couple of weeks to filter through
the various RePEc services.