Enhancing Training of Spiking Neural Network with Stochastic Latency
Document Type
Conference Proceeding
Publication Title
Proceedings of the AAAI Conference on Artificial Intelligence
Abstract
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption when deployed on neuromorphic hardware, which operates at orders of magnitude lower power than general-purpose hardware. Direct training methods for SNNs come with an inherent latency for which the SNNs are optimized, and in general, the higher the latency, the better the predictive power of the model, but at the same time, the higher the energy consumption during training and inference. Furthermore, an SNN model optimized for one particular latency does not necessarily perform well at lower latencies, which becomes relevant in scenarios where it is necessary to switch to a lower latency because of the depletion of onboard energy or other operational requirements. In this work, we propose Stochastic Latency Training (SLT), a direct training method for SNNs that optimizes the model for the given latency while incurring only a minimal reduction in predictive accuracy when shifted to lower inference latencies. We provide heuristics for our approach with partial theoretical justification and experimental evidence showing the state-of-the-art performance of our models on datasets such as CIFAR-10, DVS-CIFAR-10, CIFAR-100, and DVS-Gesture. Our code is available at https://github.com/srinuvaasu/SLT.
First Page
10900
Last Page
10908
DOI
10.1609/aaai.v38i10.28964
Publication Date
3-25-2024
Recommended Citation
S. Anumasa et al., "Enhancing Training of Spiking Neural Network with Stochastic Latency," Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 10, pp. 10900-10908, Mar. 2024.
The definitive version is available at https://doi.org/10.1609/aaai.v38i10.28964