Author
Listed:
- Shi Tong Liu
(University of Pittsburgh)
- Pilar Montes-Lourido
(University of Pittsburgh)
- Xiaoqin Wang
(Johns Hopkins University)
- Srivatsun Sadagopan
(University of Pittsburgh)
Abstract
Humans and vocal animals use vocalizations to communicate with members of their species. A necessary function of auditory perception is to generalize across the high variability inherent in vocalization production and classify them into behaviorally distinct categories (‘words’ or ‘call types’). Here, we demonstrate that detecting mid-level features in calls achieves production-invariant classification. Starting from randomly chosen marmoset call features, we use a greedy search algorithm to determine the most informative and least redundant features necessary for call classification. High classification performance is achieved using only 10–20 features per call type. Predictions of tuning properties of putative feature-selective neurons accurately match some observed auditory cortical responses. This feature-based approach also succeeds for call categorization in other species, and for other complex classification tasks such as caller identification. Our results suggest that high-level neural representations of sounds are based on task-dependent features optimized for specific computational goals.
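The abstract describes a greedy search that selects the most informative, least redundant call features for classification. As an illustration only (not the authors' published implementation), the sketch below shows a generic mRMR-style greedy selection: each candidate feature is scored by its mutual information with the call-type labels minus its average mutual information with the features already chosen, and selection stops after a small budget (e.g., 10–20 features). The matrix `F`, the label vector, and the scoring details are assumptions made for the example.

```python
import numpy as np

def mutual_info(x, y):
    """Mutual information (bits) between two discrete 1-D arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                px, py = np.mean(x == xv), np.mean(y == yv)
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def greedy_feature_search(F, labels, n_features=15):
    """Greedily pick features that are informative about call type but
    minimally redundant with features already selected.

    F: (n_calls, n_candidates) binary matrix of feature detections.
    labels: (n_calls,) call-type labels.
    Returns the indices of the selected candidate features.
    """
    candidates = list(range(F.shape[1]))
    selected = []
    while candidates and len(selected) < n_features:
        scores = []
        for j in candidates:
            relevance = mutual_info(F[:, j], labels)
            redundancy = (np.mean([mutual_info(F[:, j], F[:, k]) for k in selected])
                          if selected else 0.0)
            scores.append(relevance - redundancy)
        best = candidates[int(np.argmax(scores))]
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy usage with random binary feature detections and two call types
rng = np.random.default_rng(0)
F = rng.integers(0, 2, size=(200, 50))
labels = rng.integers(0, 2, size=200)
print(greedy_feature_search(F, labels, n_features=10))
```

This sketch captures the relevance-minus-redundancy idea mentioned in the abstract; the paper's actual feature candidates are spectrotemporal call fragments, and its selection criterion is defined in the Methods.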
Suggested Citation
Shi Tong Liu & Pilar Montes-Lourido & Xiaoqin Wang & Srivatsun Sadagopan, 2019.
"Optimal features for auditory categorization,"
Nature Communications, Nature, vol. 10(1), pages 1-14, December.
Handle:
RePEc:nat:natcom:v:10:y:2019:i:1:d:10.1038_s41467-019-09115-y
DOI: 10.1038/s41467-019-09115-y
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:10:y:2019:i:1:d:10.1038_s41467-019-09115-y. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help by adding them using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.