Data Driven Threshold and Potential Initialization for Spiking Neural Networks
Document Type
Conference Proceeding
Publication Title
Proceedings of Machine Learning Research
Abstract
Spiking neural networks (SNNs) present an increasingly popular alternative to artificial neural networks (ANNs), due to their energy and time efficiency when deployed on neuromorphic hardware. However, due to their discrete and highly non-differentiable nature, training SNNs is a challenging task and remains an active area of research. Some of the most prominent ways to train SNNs are based on ANN-to-SNN conversion, where an SNN model is initialized with parameters from the corresponding, pre-trained ANN model. SNN models trained through ANN-to-SNN conversion or hybrid training show state-of-the-art performance among SNNs on many machine learning tasks, comparable to that of ANNs. However, the top-performing models need high latency or tailored ANNs to perform well, and in general do not use the full information available from the ANNs. In this work, we propose a novel method to initialize an SNN's thresholds and initial membrane potentials after ANN-to-SNN conversion, using the distributions of the ANN's activation values. We provide a theoretical framework for feature-distribution-based conversion error, with theoretical results on the optimal membrane initializations and thresholds that minimize this error, as well as a practical algorithm for finding these optimal values. We test our method, both as a stand-alone ANN-to-SNN conversion and in combination with other methods, and show state-of-the-art results on high-dimensional datasets such as CIFAR10, CIFAR100, and ImageNet, across various architectures. Our code is available at https://github.com/srinuvaasu/data_driven_init.
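Illustrative Sketch
To make the idea concrete, the following minimal Python sketch shows one data-driven way to pick a layer's firing threshold and initial membrane potential from recorded ANN activations. It is a hedged illustration, not the paper's algorithm: the percentile cut-off and the half-threshold initialization are common heuristics from the ANN-to-SNN conversion literature, and the function name data_driven_init is hypothetical.

    import numpy as np

    def data_driven_init(activations, percentile=99.0):
        # activations: 1-D array of a layer's ReLU outputs, recorded by
        # running a small calibration set through the pre-trained ANN.
        # percentile: illustrative cut-off; the paper instead derives the
        # optimal values from the activation distribution.
        activations = np.asarray(activations, dtype=np.float64)
        # Threshold: a high percentile of the observed activations, so the
        # SNN's bounded spike rate can represent almost all ANN activations
        # without being dominated by the long tail.
        theta = np.percentile(activations, percentile)
        # Initial membrane potential: half the threshold, a common heuristic
        # that centres the rate-coding quantization error around zero.
        v_init = 0.5 * theta
        return theta, v_init

    # Hypothetical usage with stand-in data in place of recorded activations.
    rng = np.random.default_rng(0)
    acts = np.maximum(rng.standard_normal(10_000), 0.0)
    theta, v0 = data_driven_init(acts)
    print(f"threshold = {theta:.3f}, initial potential = {v0:.3f}")

In practice such statistics would be gathered per layer with forward hooks on the ANN, and the paper's contribution is to replace the percentile heuristic with values that provably minimize the feature distribution-based conversion error.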
First Page
4771
Last Page
4779
Publication Date
1-1-2024
Recommended Citation
V. Bojković et al., "Data Driven Threshold and Potential Initialization for Spiking Neural Networks," Proceedings of Machine Learning Research, vol. 238, pp. 4771–4779, Jan. 2024.