Authors listed:
- Shany Grossman
(Weizmann Institute of Science)
- Guy Gaziv
(Weizmann Institute of Science)
- Erin M. Yeagle
(Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Feinstein Institute for Medical Research)
- Michal Harel
(Weizmann Institute of Science)
- Pierre Mégevand
(Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Feinstein Institute for Medical Research
Geneva University Hospital and Faculty of Medicine)
- David M. Groppe
(Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Feinstein Institute for Medical Research
The Krembil Neuroscience Centre)
- Simon Khuvis
(Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Feinstein Institute for Medical Research)
- Jose L. Herrero
(Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Feinstein Institute for Medical Research)
- Michal Irani
(Weizmann Institute of Science)
- Ashesh D. Mehta
(Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Feinstein Institute for Medical Research)
- Rafael Malach
(Weizmann Institute of Science)
Abstract
The discovery that deep convolutional neural networks (DCNNs) achieve human performance in realistic tasks offers fresh opportunities for linking neuronal tuning properties to such tasks. Here we show that the face-space geometry, revealed through pair-wise activation similarities of face-selective neuronal groups recorded intracranially in 33 patients, significantly matches that of a DCNN having human-level face recognition capabilities. This convergent evolution of pattern similarities across biological and artificial networks highlights the significance of face-space geometry in face perception. Furthermore, the nature of the neuronal to DCNN match suggests a role of human face areas in pictorial aspects of face perception. First, the match was confined to intermediate DCNN layers. Second, presenting identity-preserving image manipulations to the DCNN abolished its correlation to neuronal responses. Finally, DCNN units matching human neuronal group tuning displayed view-point selective receptive fields. Our results demonstrate the importance of face-space geometry in the pictorial aspects of human face perception.
Suggested Citation
Shany Grossman & Guy Gaziv & Erin M. Yeagle & Michal Harel & Pierre Mégevand & David M. Groppe & Simon Khuvis & Jose L. Herrero & Michal Irani & Ashesh D. Mehta & Rafael Malach, 2019.
"Convergent evolution of face spaces across human face-selective neuronal groups and deep convolutional networks,"
Nature Communications, Nature, vol. 10(1), pages 1-13, December.
Handle:
RePEc:nat:natcom:v:10:y:2019:i:1:d:10.1038_s41467-019-12623-6
DOI: 10.1038/s41467-019-12623-6
Citations
Cited by:
- Irina Higgins & Le Chang & Victoria Langston & Demis Hassabis & Christopher Summerfield & Doris Tsao & Matthew Botvinick, 2021.
"Unsupervised deep learning identifies semantic disentanglement in single inferotemporal face patch neurons,"
Nature Communications, Nature, vol. 12(1), pages 1-14, December.
- Annika Garlichs & Helen Blank, 2024.
"Prediction error processing and sharpening of expected information across the face-processing hierarchy,"
Nature Communications, Nature, vol. 15(1), pages 1-18, December.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:10:y:2019:i:1:d:10.1038_s41467-019-12623-6. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.