Gaussian synapses for probabilistic neural networks

Amritanand Sebastian, Andrew Pannone, Shiva Subbulakshmi Radhakrishnan, Saptarshi Das

Research output: Contribution to journal › Article › peer-review

83 Scopus citations


The recent decline in energy, size, and complexity scaling of the traditional von Neumann architecture has resurrected considerable interest in brain-inspired computing. Artificial neural networks (ANNs) based on emerging devices, such as memristors, achieve brain-like computing but lack energy efficiency. Furthermore, slow learning, incremental adaptation, and false convergence remain unresolved challenges for ANNs. In this article, we therefore introduce Gaussian synapses based on heterostructures of atomically thin two-dimensional (2D) layered materials, namely molybdenum disulfide and black phosphorus field effect transistors (FETs), as a class of analog and probabilistic computational primitives for the hardware implementation of statistical neural networks. We also demonstrate complete tunability of the amplitude, mean, and standard deviation of the Gaussian synapse via threshold engineering in dual-gated molybdenum disulfide and black phosphorus FETs. Finally, we show simulation results for the classification of brainwaves using Gaussian-synapse-based probabilistic neural networks.
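The abstract describes a synapse whose transfer curve is a Gaussian with tunable amplitude, mean, and standard deviation, used as the kernel of a probabilistic neural network. A minimal software analogue of that idea (not the authors' device model; the function names, parameters, and the Parzen-style class scoring below are illustrative assumptions) might look like:

```python
import numpy as np

def gaussian_synapse(v, amplitude=1.0, mean=0.0, std=1.0):
    """Idealized Gaussian synapse transfer curve.

    In the paper's device, amplitude, mean, and std are set by
    threshold engineering of dual-gated MoS2 / black phosphorus FETs;
    here they are plain parameters (an illustrative assumption).
    """
    return amplitude * np.exp(-((v - mean) ** 2) / (2.0 * std ** 2))

def pnn_classify(x, train_x, train_y, std=0.5):
    """Toy probabilistic neural network (Parzen-window style).

    Each training sample contributes one Gaussian synapse response
    centered on itself; the class with the largest mean response wins.
    """
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        members = train_x[train_y == c]
        # Distance of the query point to every stored pattern of class c,
        # passed through the Gaussian synapse.
        responses = gaussian_synapse(np.linalg.norm(members - x, axis=1),
                                     std=std)
        scores.append(responses.mean())
    return classes[int(np.argmax(scores))]
```

A query point landing near one cluster of stored patterns yields a larger summed Gaussian response for that class, which is the probabilistic decision rule the abstract's brainwave-classification simulations rely on.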

Original language: English (US)
Article number: 4199
Journal: Nature Communications
Issue number: 1
State: Published - Dec 1 2019

All Science Journal Classification (ASJC) codes

  • General Chemistry
  • General Biochemistry, Genetics and Molecular Biology
  • General Physics and Astronomy


