The neural coding framework for learning generative models

Alexander Ororbia, Daniel Kifer

Research output: Contribution to journal › Article › peer-review

41 Scopus citations

Abstract

Neural generative models can be used to learn complex probability distributions from data, to sample from them, and to produce probability density estimates. We propose a computational framework for developing neural generative models inspired by the theory of predictive processing in the brain. According to predictive processing theory, the neurons in the brain form a hierarchy in which neurons in one level form expectations about sensory inputs from another level. These neurons update their local models based on differences between their expectations and the observed signals. In a similar way, artificial neurons in our generative models predict what neighboring neurons will do, and adjust their parameters based on how well these predictions match reality. In this work, we show that the neural generative models learned within our framework perform well in practice across several benchmark datasets and metrics, and either remain competitive with or significantly outperform other generative models with similar functionality (such as the variational auto-encoder).
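The learning scheme the abstract describes can be made concrete with a small sketch. The snippet below is a generic predictive-coding toy in NumPy, not the paper's actual implementation: a two-layer hierarchy in which each upper layer predicts the activity of the layer below it, and both neural states and weights are adjusted using only local prediction errors. The layer sizes, the tanh activation, the step sizes, and all names (`W1`, `W2`, `settle_and_learn`) are illustrative assumptions.

```python
# A minimal, generic predictive-coding sketch (NumPy). This is NOT the
# paper's exact architecture; sizes, activation, and step sizes are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Two latent layers predict downward: z2 -> z1 -> x (the observation).
n_x, n_1, n_2 = 784, 360, 360
W1 = rng.normal(0.0, 0.05, (n_x, n_1))   # predicts x from layer 1
W2 = rng.normal(0.0, 0.05, (n_1, n_2))   # predicts layer 1 from layer 2

f  = np.tanh                              # activation
df = lambda v: 1.0 - np.tanh(v) ** 2      # its derivative

def settle_and_learn(x, K=20, beta=0.1, eta=0.001):
    """Run K local inference steps, then one local weight update."""
    global W1, W2
    z1 = np.zeros((n_1, 1))
    z2 = np.zeros((n_2, 1))
    for _ in range(K):
        # Top-down expectations about the layer below.
        mu0 = W1 @ f(z1)                  # prediction of x
        mu1 = W2 @ f(z2)                  # prediction of z1
        # Local prediction errors (observation minus expectation).
        e0 = x - mu0
        e1 = z1 - mu1
        # Each state moves to reduce the error it causes in the layer
        # below and the error it incurs from above -- all local signals.
        z1 += beta * (-e1 + (W1.T @ e0) * df(z1))
        z2 += beta * ((W2.T @ e1) * df(z2))
    # Hebbian-style updates: prediction error times presynaptic activity.
    W1 += eta * e0 @ f(z1).T
    W2 += eta * e1 @ f(z2).T
    return float(np.sum(e0 ** 2))         # reconstruction error on x

x = rng.random((n_x, 1))                  # stand-in for one image vector
for step in range(50):
    loss = settle_and_learn(x)
print(f"final reconstruction SSE: {loss:.4f}")
```

The property the abstract emphasizes is visible in the sketch: every update to `z1`, `z2`, `W1`, and `W2` uses only quantities available at that layer (its own error and the error of the layer directly below), with no globally backpropagated gradient.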

Original language: English (US)
Article number: 2064
Journal: Nature Communications
Volume: 13
Issue number: 1
DOI:
State: Published - Dec 2022

All Science Journal Classification (ASJC) codes

  • General Chemistry
  • General Biochemistry, Genetics and Molecular Biology
  • General Physics and Astronomy
