
The neural coding framework for learning generative models

Authors

  • Alexander Ororbia (Rochester Institute of Technology)

  • Daniel Kifer (The Pennsylvania State University)

Abstract

Neural generative models can be used to learn complex probability distributions from data, to sample from them, and to produce probability density estimates. We propose a computational framework for developing neural generative models inspired by the theory of predictive processing in the brain. According to predictive processing theory, the neurons in the brain form a hierarchy in which neurons in one level form expectations about sensory inputs from another level. These neurons update their local models based on differences between their expectations and the observed signals. In a similar way, artificial neurons in our generative models predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality. In this work, we show that the neural generative models learned within our framework perform well in practice across several benchmark datasets and metrics and either remain competitive with or significantly outperform other generative models with similar functionality (such as the variational auto-encoder).
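The local learning rule described in the abstract can be illustrated with a minimal sketch. This is not the authors' actual algorithm, only a toy two-level predictive-coding loop under assumed dimensions and learning rates: a latent state generates a top-down expectation of the input, and both the state and the weights are adjusted using only the local prediction error, with no backpropagated global gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-level hierarchy: a latent state z predicts an observation x
# through generative weights W (all sizes and rates are illustrative).
dim_z, dim_x = 4, 8
W = rng.normal(scale=0.1, size=(dim_x, dim_z))  # generative weights
x = rng.normal(size=dim_x)                      # "sensory" input
z = np.zeros(dim_z)                             # latent state

lr_state, lr_weight = 0.1, 0.01
for _ in range(50):
    pred = W @ z               # top-down expectation of x
    err = x - pred             # local prediction error
    z += lr_state * (W.T @ err)        # settle the latent state
    W += lr_weight * np.outer(err, z)  # local Hebbian-style update

print(float(np.mean((x - W @ z) ** 2)))  # reconstruction error shrinks
```

The key property this sketch shares with the framework is locality: each update uses only quantities available at the connection itself (the error at the target and the activity at the source), rather than a gradient propagated through the whole network.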

Suggested Citation

  • Alexander Ororbia & Daniel Kifer, 2022. "The neural coding framework for learning generative models," Nature Communications, Nature, vol. 13(1), pages 1-14, December.
  • Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-29632-7
    DOI: 10.1038/s41467-022-29632-7

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-022-29632-7
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-022-29632-7?utm_source=ideas

    References listed on IDEAS

    1. Karl Friston, 2008. "Hierarchical Models in the Brain," PLOS Computational Biology, Public Library of Science, vol. 4(11), pages 1-24, November.
    2. Anthony M. Zador, 2019. "A critique of pure learning and what artificial neural networks can learn from animal brains," Nature Communications, Nature, vol. 10(1), pages 1-7, December.
    3. Timothy P. Lillicrap & Daniel Cownden & Douglas B. Tweed & Colin J. Akerman, 2016. "Random synaptic feedback weights support error backpropagation for deep learning," Nature Communications, Nature, vol. 7(1), pages 1-10, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Yao, Zhao & Sun, Kehui & Wang, Huihai, 2024. "Collective behaviors of fractional-order FitzHugh–Nagumo network," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 639(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Federico Bolaños & Javier G. Orlandi & Ryo Aoki & Akshay V. Jagadeesh & Justin L. Gardner & Andrea Benucci, 2024. "Efficient coding of natural images in the mouse visual cortex," Nature Communications, Nature, vol. 15(1), pages 1-17, December.
    2. Micha Heilbron & Florent Meyniel, 2019. "Confidence resets reveal hierarchical adaptive learning in humans," PLOS Computational Biology, Public Library of Science, vol. 15(4), pages 1-24, April.
    3. John C. Boik, 2020. "Science-Driven Societal Transformation, Part I: Worldview," Sustainability, MDPI, vol. 12(17), pages 1-28, August.
    4. Bossert, Leonie & Hagendorff, Thilo, 2021. "Animals and AI. The role of animals in AI research and application – An overview and ethical evaluation," Technology in Society, Elsevier, vol. 67(C).
    5. Mateus Joffily & Giorgio Coricelli, 2013. "Emotional Valence and the Free-Energy Principle," Post-Print halshs-00834063, HAL.
    6. Falk Lieder & Klaas E Stephan & Jean Daunizeau & Marta I Garrido & Karl J Friston, 2013. "A Neurocomputational Model of the Mismatch Negativity," PLOS Computational Biology, Public Library of Science, vol. 9(11), pages 1-14, November.
    7. Francesco Poli & Yi-Lin Li & Pravallika Naidu & Rogier B. Mars & Sabine Hunnius & Azzurra Ruggeri, 2024. "Toddlers strategically adapt their information search," Nature Communications, Nature, vol. 15(1), pages 1-10, December.
    8. Jaroslav Vítků & Petr Dluhoš & Joseph Davidson & Matěj Nikl & Simon Andersson & Přemysl Paška & Jan Šinkora & Petr Hlubuček & Martin Stránský & Martin Hyben & Martin Poliak & Jan Feyereisl & Marek Ros, 2020. "ToyArchitecture: Unsupervised learning of interpretable models of the environment," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-50, May.
    9. Ünsal Özdilek, 2021. "Sensing Happiness in Senseless Information," Applied Research in Quality of Life, Springer;International Society for Quality-of-Life Studies, vol. 16(5), pages 2059-2084, October.
    10. Keitaro Obara & Teppei Ebina & Shin-Ichiro Terada & Takanori Uka & Misako Komatsu & Masafumi Takaji & Akiya Watakabe & Kenta Kobayashi & Yoshito Masamizu & Hiroaki Mizukami & Tetsuo Yamamori & Kiyoto , 2023. "Change detection in the primate auditory cortex through feedback of prediction error signals," Nature Communications, Nature, vol. 14(1), pages 1-17, December.
    11. Giorgia Dellaferrera & Stanisław Woźniak & Giacomo Indiveri & Angeliki Pantazi & Evangelos Eleftheriou, 2022. "Introducing principles of synaptic integration in the optimization of deep neural networks," Nature Communications, Nature, vol. 13(1), pages 1-14, December.
    12. Adeeti Aggarwal & Connor Brennan & Jennifer Luo & Helen Chung & Diego Contreras & Max B. Kelz & Alex Proekt, 2022. "Visual evoked feedforward–feedback traveling waves organize neural activity across the cortical hierarchy in mice," Nature Communications, Nature, vol. 13(1), pages 1-16, December.
    13. Dileep George & Jeff Hawkins, 2009. "Towards a Mathematical Theory of Cortical Micro-circuits," PLOS Computational Biology, Public Library of Science, vol. 5(10), pages 1-26, October.
    14. Barbara Feulner & Matthew G. Perich & Raeed H. Chowdhury & Lee E. Miller & Juan A. Gallego & Claudia Clopath, 2022. "Small, correlated changes in synaptic connectivity may facilitate rapid motor learning," Nature Communications, Nature, vol. 13(1), pages 1-14, December.
    15. Mitsumasa Nakajima & Katsuma Inoue & Kenji Tanaka & Yasuo Kuniyoshi & Toshikazu Hashimoto & Kohei Nakajima, 2022. "Physical deep learning with biologically inspired training method: gradient-free approach for physical hardware," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    16. Bao, Han & Yu, Xihong & Zhang, Yunzhen & Liu, Xiaofeng & Chen, Mo, 2023. "Initial condition-offset regulating synchronous dynamics and energy diversity in a memristor-coupled network of memristive HR neurons," Chaos, Solitons & Fractals, Elsevier, vol. 177(C).
    17. Ertam, Fatih, 2019. "An efficient hybrid deep learning approach for internet security," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 535(C).
    18. Robert Rosenbaum, 2022. "On the relationship between predictive coding and backpropagation," PLOS ONE, Public Library of Science, vol. 17(3), pages 1-27, March.
    19. Navid Shervani-Tabar & Robert Rosenbaum, 2023. "Meta-learning biologically plausible plasticity rules with random feedback pathways," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    20. David Balduzzi & Giulio Tononi, 2009. "Qualia: The Geometry of Integrated Information," PLOS Computational Biology, Public Library of Science, vol. 5(8), pages 1-24, August.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.