TY - GEN
T1 - Stochastic Spiking Neural Networks with First-to-Spike Coding
AU - Jiang, Yi
AU - Lu, Sen
AU - Sengupta, Abhronil
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Spiking Neural Networks (SNNs), recognized as the third generation of neural networks, are known for their bio-plausibility and energy efficiency, especially when implemented on neuromorphic hardware. However, the majority of existing studies on SNNs have concentrated on deterministic neurons with rate coding, a method that incurs substantial computational overhead due to lengthy information integration times and fails to fully harness the brain's probabilistic inference capabilities and temporal dynamics. In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures where we integrate stochastic spiking neuron models with temporal coding techniques. Through extensive benchmarking with other deterministic SNNs and rate-based coding, we investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, energy consumption, and robustness. Our work is the first to extend the scalability of direct training approaches of stochastic SNNs with temporal encoding to VGG architectures and beyond-MNIST datasets.
UR - http://www.scopus.com/inward/record.url?scp=85214666308&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85214666308&partnerID=8YFLogxK
U2 - 10.1109/ICONS62911.2024.00012
DO - 10.1109/ICONS62911.2024.00012
M3 - Conference contribution
AN - SCOPUS:85214666308
T3 - Proceedings - 2024 International Conference on Neuromorphic Systems, ICONS 2024
SP - 24
EP - 31
BT - Proceedings - 2024 International Conference on Neuromorphic Systems, ICONS 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 International Conference on Neuromorphic Systems, ICONS 2024
Y2 - 30 July 2024 through 2 August 2024
ER -