Poster

Data Driven Threshold and Potential Initialization for Spiking Neural Networks

Velibor Bojkovic · Srinivas Anumasa · Giulia De Masi · Bin Gu · Huan Xiong

MR1 & MR2 - Number 118
Sat 4 May 6 a.m. PDT — 8:30 a.m. PDT

Abstract:

Spiking neural networks (SNNs) are an increasingly popular alternative to artificial neural networks (ANNs) due to their energy and time efficiency when deployed on neuromorphic hardware. However, because of their discrete and highly non-differentiable nature, training SNNs is challenging and remains an active area of research. Some of the most prominent ways to train SNNs are based on ANN-to-SNN conversion, where an SNN model is initialized with the parameters of a corresponding, pre-trained ANN model. SNN models trained through ANN-to-SNN conversion or hybrid training show state-of-the-art performance among SNNs on many machine learning tasks, comparable to that of ANNs. However, the top-performing models need high latency or tailored ANNs to perform well, and in general they do not use the full information available from the ANNs. In this work, we propose a novel method to initialize an SNN's thresholds and initial membrane potentials after ANN-to-SNN conversion, using the distributions of the ANN's activation values. We provide a theoretical framework for the feature-distribution-based conversion error, including theoretical results on the optimal membrane initialization and thresholds that minimize this error, as well as a practical algorithm for finding these optimal values. We test our method, both as a stand-alone ANN-to-SNN conversion and in combination with other methods, and show state-of-the-art results across various architectures on high-dimensional datasets such as CIFAR10, CIFAR100, and ImageNet. Our code is available at https://github.com/srinuvaasu/datadriveninit
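To make the idea concrete, below is a minimal sketch of the kind of data-driven initialization the abstract describes, written for a PyTorch setting. The helper name `init_threshold_and_potential`, the percentile-based threshold, and the half-threshold initial potential are illustrative assumptions drawn from common conversion heuristics, not the paper's actual algorithm, which instead derives provably optimal values from the activation distribution.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def init_threshold_and_potential(ann_layer, calib_inputs, percentile=99.7):
    """Illustrative helper (not the paper's algorithm): choose an SNN layer's
    firing threshold and initial membrane potential from the distribution of
    the corresponding ANN layer's activations on calibration data."""
    # Collect post-ReLU activations of the pre-trained ANN layer.
    acts = torch.relu(ann_layer(calib_inputs)).flatten()
    # Heuristic threshold: a high percentile of the activation distribution,
    # which clips rare outliers instead of spending spike range on them.
    theta = torch.quantile(acts, percentile / 100.0).item()
    # Common heuristic for the initial membrane potential: half the
    # threshold, which centers the quantization error of the spike code.
    v_init = 0.5 * theta
    return theta, v_init

# Hypothetical usage on a single fully connected layer.
layer = nn.Linear(128, 256)
calib = torch.randn(512, 128)  # stand-in for real calibration inputs
theta, v_init = init_threshold_and_potential(layer, calib)
print(f"threshold={theta:.4f}, initial potential={v_init:.4f}")
```

In the paper's framework, both quantities would instead be chosen to minimize the feature distribution-based conversion error, rather than set by the percentile and half-threshold rules used here for illustration.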
