Optimizing SNNs using metaheuristics


I'm keen to explore optimizing SNN parameters using a metaheuristic algorithm such as the bat algorithm optimizer… besides firing rate, is there any other parameter that can be optimized?

Hi @fendie,

As far as I know, in NengoDL you can choose to train the weights, biases, and encoders of your network. I believe there is also a way to add custom trainable parameters via TensorFlow, although I have never actually done this myself. You might want to check out this or this entry from the documentation page. Hope you find what you need.
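Since the original question was about metaheuristics rather than gradient-based training, here is a toy sketch of how a basic bat algorithm treats parameters as a black-box minimization problem. To be clear, this is plain Python with a stand-in objective function, not NengoDL code; in practice the objective would wrap a full SNN evaluation (build, simulate, score), and the constants here are illustrative:

```python
import math
import random

def bat_algorithm(objective, dim, bounds, n_bats=20, n_iter=200, seed=0,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9):
    """Minimize `objective` over a box using a basic bat algorithm."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize bat positions, velocities, loudness A, and pulse rate r.
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bats)]
    v = [[0.0] * dim for _ in range(n_bats)]
    A = [1.0] * n_bats   # loudness: shrinks whenever a bat improves
    r = [0.5] * n_bats   # pulse emission rate: grows over the iterations
    fit = [objective(xi) for xi in x]
    best = min(range(n_bats), key=lambda i: fit[i])
    x_best, f_best = list(x[best]), fit[best]

    for t in range(n_iter):
        mean_A = sum(A) / n_bats
        for i in range(n_bats):
            # Global move: frequency-tuned velocity update toward the best bat.
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] = [v[i][d] + (x[i][d] - x_best[d]) * freq for d in range(dim)]
            cand = [min(hi, max(lo, x[i][d] + v[i][d])) for d in range(dim)]
            # Local search: random walk around the current best solution.
            if rng.random() > r[i]:
                cand = [min(hi, max(lo, x_best[d] + rng.uniform(-1, 1) * mean_A))
                        for d in range(dim)]
            f_cand = objective(cand)
            # Accept improvements with probability tied to loudness,
            # then adapt loudness and pulse rate for this bat.
            if f_cand < fit[i] and rng.random() < A[i]:
                x[i], fit[i] = cand, f_cand
                A[i] *= alpha
                r[i] = 0.5 * (1.0 - math.exp(-gamma * (t + 1)))
            if fit[i] < f_best:
                x_best, f_best = list(x[i]), fit[i]
    return x_best, f_best
```

For example, `bat_algorithm(lambda p: sum(c * c for c in p), dim=3, bounds=(-5.0, 5.0))` drives the toy sphere objective toward zero. The expensive part in an SNN setting is the objective call itself, which is why metaheuristics on deep networks with many weights get slow, as you noted.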

From my understanding, the weights and biases in an SNN are tuned just like in an ANN. I'm therefore not looking to optimize the weights and biases, because the ANN model that I want to implement is quite deep, so it has a lot of weights and biases and would take very long to optimize. So can I know which parameters specific to SNNs can be optimized?

I’m new to SNNs, so please correct my understanding if necessary.

In NengoDL, SNNs are very much like ANNs in terms of the way they are optimized. In fact, during the training process, NengoDL uses the rate model of the neurons, essentially “converting” the SNN into an ANN. The one other thing you can optimize (apart from the weights and biases) in an SNN is the spiking rate of the neurons themselves. However, that’s really just another way of determining which weights and biases to use (the weights and biases determine the spike rate of the neurons). You can see an example of this optimization done in this NengoDL example (see the part about spiking rate regularization).
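To make the firing-rate idea concrete, here is a toy sketch in plain Python (not the NengoDL API): the steady-state LIF rate equation is standard, but the penalty function is an illustrative stand-in for the kind of regularization loss the linked example adds during training, pushing the neurons' rates toward a target value.

```python
import math

def lif_rate(current, tau_rc=0.02, tau_ref=0.002):
    """Steady-state firing rate (Hz) of a LIF neuron for a constant input current."""
    if current <= 1.0:  # below threshold: the neuron never spikes
        return 0.0
    return 1.0 / (tau_ref + tau_rc * math.log(current / (current - 1.0)))

def rate_penalty(currents, target_rate):
    """Mean squared deviation of the neurons' firing rates from a target rate."""
    rates = [lif_rate(j) for j in currents]
    return sum((r - target_rate) ** 2 for r in rates) / len(rates)
```

The penalty is small when the input currents (which the weights and biases determine) put the neurons near the target rate, and large otherwise, which is exactly why regularizing the spike rate is just another way of shaping the weights and biases.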