
Understanding Auditory Spectro-Temporal Receptive Fields and Their Changes with Input Statistics by Efficient Coding Principles

Authors
  • Lingyun Zhao
  • Li Zhaoping

Abstract

Spectro-temporal receptive fields (STRFs) have been widely used as linear approximations to the signal transform from sound spectrograms to neural responses along the auditory pathway. Their dependence on statistical attributes of the stimuli, such as sound intensity, is usually explained by nonlinear mechanisms and models. Here, we apply an efficient coding principle, which has been used successfully to understand receptive fields in early stages of visual processing, to provide a computational understanding of the STRFs. According to this principle, STRFs result from an optimal tradeoff between maximizing the sensory information the brain receives and minimizing the cost of the neural activities required to represent and transmit this information. Both terms depend on the statistical properties of the sensory inputs and of the noise that corrupts them. The STRFs should therefore depend on the input power spectrum and the signal-to-noise ratio, which is assumed to increase with input intensity. We analytically derive the optimal STRFs when signal and noise are approximated as Gaussians. Under the constraint that they should be spectro-temporally local, the STRFs are predicted to adapt from band-pass to low-pass filters as the input intensity decreases, or as the input correlations become longer range in sound frequency or time. These predictions qualitatively match physiological observations. Our prediction of how the STRFs should be determined by the input power spectrum could readily be tested, since this spectrum depends on the stimulus ensemble. The potential and limitations of the efficient coding principle are discussed.

Author Summary

Spectro-temporal receptive fields (STRFs) have been widely used as linear approximations of the signal transform from sound spectrograms to neural responses along the auditory pathway. Their dependence on the ensemble of input stimuli has usually been examined mechanistically as a possibly complex nonlinear process. We propose that the STRFs and their dependence on the input ensemble can be understood by an efficient coding principle, according to which the responses of the encoding neurons report the maximum amount of information about the sensory input, subject to limits on the neural cost of representing and transmitting information. This proposal is inspired by the success of the same principle in accounting for receptive fields in the early stages of the visual pathway and their adaptation to input statistics. The principle can account for the STRFs that have been observed, and for the way they change with sound intensity. Further, it predicts how the STRFs should change with input correlations, an issue that has not been extensively investigated. In sum, our study provides a computational understanding of the neural transformations of auditory inputs and makes testable predictions for future experiments.
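
The tradeoff described in the abstract can be made concrete with a small numerical sketch. Under the Gaussian approximation, frequency channels decouple, and for each channel with signal power S(f) one chooses a gain g(f) that maximizes the transmitted information minus a cost on output power. The Python sketch below is not the paper's analytic derivation: the 1/f input spectrum, the noise powers, the cost weight lam, and the grid search over gains are all illustrative assumptions. It nevertheless reproduces the qualitative prediction: at high input intensity (high signal-to-noise ratio) the optimal gain profile is band-pass, while at low intensity it becomes low-pass.

    import numpy as np

    # Efficient coding tradeoff for one Gaussian channel: choose gain g to
    # maximize  I(g) - lam * cost(g),  where
    #   I(g)    = 0.5 * log2(1 + g^2*S / (g^2*N_in + N_out))  (bits transmitted)
    #   cost(g) = g^2*(S + N_in) + N_out                      (output power)
    # S is the channel's signal power; N_in and N_out are input and output
    # noise powers. All parameter values below are illustrative assumptions,
    # not the paper's actual settings.

    def optimal_gain(S, N_in, N_out, lam, g_grid):
        g2 = g_grid ** 2
        info = 0.5 * np.log2(1.0 + g2 * S / (g2 * N_in + N_out))
        cost = g2 * (S + N_in) + N_out
        return g_grid[np.argmax(info - lam * cost)]

    freqs = np.linspace(0.5, 32.0, 64)     # arbitrary sound-frequency axis
    g_grid = np.linspace(0.0, 10.0, 2001)  # candidate gains for grid search
    N_in, N_out, lam = 1.0, 1.0, 0.01      # assumed noise powers, cost weight

    for intensity, label in [(16.0, "high intensity"), (0.5, "low intensity")]:
        S = intensity / freqs              # assumed 1/f-like input spectrum
        gains = [optimal_gain(s, N_in, N_out, lam, g_grid) for s in S]
        f_peak = freqs[int(np.argmax(gains))]
        shape = "band-pass" if f_peak > freqs[0] else "low-pass"
        print(f"{label}: optimal gain peaks at f = {f_peak:.1f} ({shape})")

In this toy setting, the high-SNR optimum approximately whitens the input (g^2 roughly proportional to 1/S(f), so gain rises with f for a 1/f spectrum) until the SNR crossover, beyond which noise suppression pulls the gain back down; at low intensity the whole band sits below the crossover, so the gain simply decays with f.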

Suggested Citation

  • Lingyun Zhao & Li Zhaoping, 2011. "Understanding Auditory Spectro-Temporal Receptive Fields and Their Changes with Input Statistics by Efficient Coding Principles," PLOS Computational Biology, Public Library of Science, vol. 7(8), pages 1-16, August.
  • Handle: RePEc:plo:pcbi00:1002123
    DOI: 10.1371/journal.pcbi.1002123

    Download full text from publisher

    File URL: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1002123
    Download Restriction: no

    File URL: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1002123&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pcbi.1002123?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    References listed on IDEAS

    1. Israel Nelken & Yaron Rotman & Omer Bar Yosef, 1999. "Responses of auditory-cortex neurons to structural features of natural sounds," Nature, Nature, vol. 397(6715), pages 154-157, January.
    2. Jan W. H. Schnupp & Thomas D. Mrsic-Flogel & Andrew J. King, 2001. "Linear processing of spatial cues in primary auditory cortex," Nature, Nature, vol. 414(6860), pages 200-204, November.
    3. Evan C. Smith & Michael S. Lewicki, 2006. "Efficient auditory coding," Nature, Nature, vol. 439(7079), pages 978-982, February.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jacob N Oppenheim & Pavel Isakov & Marcelo O Magnasco, 2013. "Degraded Time-Frequency Acuity to Time-Reversed Notes," PLOS ONE, Public Library of Science, vol. 8(6), pages 1-6, June.
    2. Noga Mosheiff & Haggai Agmon & Avraham Moriel & Yoram Burak, 2017. "An efficient coding theory for a dynamic trajectory predicts non-uniform allocation of entorhinal grid cells to modules," PLOS Computational Biology, Public Library of Science, vol. 13(6), pages 1-19, June.
    3. Sam V Norman-Haignere & Josh H McDermott, 2018. "Neural responses to natural and model-matched stimuli reveal distinct computations in primary and nonprimary auditory cortex," PLOS Biology, Public Library of Science, vol. 16(12), pages 1-46, December.
    4. Jonathan J Hunt & Peter Dayan & Geoffrey J Goodhill, 2013. "Sparse Coding Can Predict Primary Visual Cortex Receptive Field Changes Induced by Abnormal Visual Input," PLOS Computational Biology, Public Library of Science, vol. 9(5), pages 1-17, May.
    5. Lubomir Kostal & Petr Lansky & Jean-Pierre Rospars, 2008. "Efficient Olfactory Coding in the Pheromone Receptor Neuron of a Moth," PLOS Computational Biology, Public Library of Science, vol. 4(4), pages 1-11, April.
    6. Julie E Elie & Frédéric E Theunissen, 2019. "Invariant neural responses for sensory categories revealed by the time-varying information for communication calls," PLOS Computational Biology, Public Library of Science, vol. 15(9), pages 1-43, September.
    7. Jonathan Schaffner & Sherry Dongqi Bao & Philippe N. Tobler & Todd A. Hare & Rafael Polania, 2023. "Sensory perception relies on fitness-maximizing codes," Nature Human Behaviour, Nature, vol. 7(7), pages 1135-1151, July.
    8. Roohollah Massoudi & Marc M Van Wanrooij & Huib Versnel & A John Van Opstal, 2015. "Spectrotemporal Response Properties of Core Auditory Cortex Neurons in Awake Monkey," PLOS ONE, Public Library of Science, vol. 10(2), pages 1-30, February.
    9. Gonzalo H Otazu & Christian Leibold, 2011. "A Corticothalamic Circuit Model for Sound Identification in Complex Scenes," PLOS ONE, Public Library of Science, vol. 6(9), pages 1-15, September.
    10. Clara Suied & Isabelle Viaud-Delmon, 2009. "Auditory-Visual Object Recognition Time Suggests Specific Processing for Animal Sounds," PLOS ONE, Public Library of Science, vol. 4(4), pages 1-9, April.
    11. Tomas Barta & Lubomir Kostal, 2019. "The effect of inhibition on rate code efficiency indicators," PLOS Computational Biology, Public Library of Science, vol. 15(12), pages 1-21, December.
    12. Mina Sadeghi & Xiu Zhai & Ian H Stevenson & Monty A Escabí, 2019. "A neural ensemble correlation code for sound category identification," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-41, October.
    13. Philippe Albouy & Samuel A. Mehr & Roxane S. Hoyer & Jérémie Ginzburg & Yi Du & Robert J. Zatorre, 2024. "Spectro-temporal acoustical markers differentiate speech from song across cultures," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    14. Oded Barzelay & Miriam Furst & Omri Barak, 2017. "A New Approach to Model Pitch Perception Using Sparse Coding," PLOS Computational Biology, Public Library of Science, vol. 13(1), pages 1-36, January.
    15. Klaus Wimmer & K Jannis Hildebrandt & R Matthias Hennig & Klaus Obermayer, 2008. "Adaptation and Selective Information Transmission in the Cricket Auditory Neuron AN2," PLOS Computational Biology, Public Library of Science, vol. 4(9), pages 1-18, September.
    16. Joseph D. Zak & Gautam Reddy & Vaibhav Konanur & Venkatesh N. Murthy, 2024. "Distinct information conveyed to the olfactory bulb by feedforward input from the nose and feedback from the cortex," Nature Communications, Nature, vol. 15(1), pages 1-16, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pcbi00:1002123. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ploscompbiol (email available below). General contact details of provider: https://journals.plos.org/ploscompbiol/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.