Authors
- Dustin Garner
(University of California Santa Barbara)
- Emil Kind
(Freie Universität Berlin)
- Jennifer Yuet Ha Lai
(University of California Santa Barbara)
- Aljoscha Nern
(Howard Hughes Medical Institute)
- Arthur Zhao
(Howard Hughes Medical Institute)
- Lucy Houghton
(University of California Santa Barbara)
- Gizem Sancer
(Freie Universität Berlin; Yale University)
- Tanya Wolff
(Howard Hughes Medical Institute)
- Gerald M. Rubin
(Howard Hughes Medical Institute)
- Mathias F. Wernet
(Freie Universität Berlin)
- Sung Soo Kim
(University of California Santa Barbara)
Abstract
Many animals use visual information to navigate[1–4], but how such information is encoded and integrated by the navigation system remains incompletely understood. In Drosophila melanogaster, EPG neurons in the central complex compute the heading direction[5] by integrating visual input from ER neurons[6–12], which are part of the anterior visual pathway (AVP)[10,13–16]. Here we densely reconstruct all neurons in the AVP using electron-microscopy data[17]. The AVP comprises four neuropils, sequentially linked by three major classes of neurons: MeTu neurons[10,14,15], which connect the medulla in the optic lobe to the small unit of the anterior optic tubercle (AOTUsu) in the central brain; TuBu neurons[9,16], which connect the AOTUsu to the bulb neuropil; and ER neurons[6–12], which connect the bulb to the EPG neurons. On the basis of morphologies, connectivity between neural classes and the locations of synapses, we identify distinct information channels that originate from four types of MeTu neurons, and we further divide these into ten subtypes according to the presynaptic connections in the medulla and the postsynaptic connections in the AOTUsu. Using the connectivity of the entire AVP and the dendritic fields of the MeTu neurons in the optic lobes, we infer potential visual features and the visual area from which any ER neuron receives input. We confirm some of these predictions physiologically. These results provide a strong foundation for understanding how distinct sensory features can be extracted and transformed across multiple processing stages to construct higher-order cognitive representations.
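The feed-forward architecture described in the abstract can be summarized as a simple directed chain of neuropils linked by neuron classes. The sketch below is only an illustrative summary, not the reconstructed connectome: the neuropil and neuron-class names (medulla, MeTu, AOTUsu, TuBu, bulb, ER, EPG) come from the abstract, while the graph representation and the `trace_pathway` helper are assumptions made for this example.

```python
# Hedged sketch: the anterior visual pathway (AVP) as a chain of neuropils
# linked by three neuron classes. Names come from the paper's abstract;
# the data structure itself is illustrative only.

AVP_EDGES = {
    # neuron class: (presynaptic neuropil, postsynaptic neuropil)
    "MeTu": ("medulla", "AOTUsu"),
    "TuBu": ("AOTUsu", "bulb"),
    "ER":   ("bulb", "EPG dendrites"),
}

def trace_pathway(edges):
    """Order the neuron classes into the feed-forward chain they form."""
    by_source = {src: (cls, dst) for cls, (src, dst) in edges.items()}
    # The entry point is the neuropil that is never a postsynaptic target.
    targets = {dst for _, dst in edges.values()}
    start = next(src for src, _ in edges.values() if src not in targets)
    chain, node = [], start
    while node in by_source:
        cls, node = by_source[node]
        chain.append(cls)
    return start, chain, node

start, chain, end = trace_pathway(AVP_EDGES)
print(start, "->", " -> ".join(chain), "->", end)
# medulla -> MeTu -> TuBu -> ER -> EPG dendrites
```

This recovers the processing order stated in the abstract (MeTu, then TuBu, then ER) purely from the source/target relations, which is the same sequential-linkage property the paper describes.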
Suggested Citation
Dustin Garner & Emil Kind & Jennifer Yuet Ha Lai & Aljoscha Nern & Arthur Zhao & Lucy Houghton & Gizem Sancer & Tanya Wolff & Gerald M. Rubin & Mathias F. Wernet & Sung Soo Kim, 2024.
"Connectomic reconstruction predicts visual features used for navigation,"
Nature, Nature, vol. 634(8032), pages 181-190, October.
Handle:
RePEc:nat:nature:v:634:y:2024:i:8032:d:10.1038_s41586-024-07967-z
DOI: 10.1038/s41586-024-07967-z