4-layer SNN using Nengo Core (2.8.0)

Hi kvemuru,

If you’d like to do a four-layer SNN, my guess is that you want something that is trained end-to-end with backpropagation, as is done with a traditional ANN. For that, you’ll need to use NengoDL. There’s a NengoDL spiking MNIST example that can get you started.
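
To give you a flavor, here’s a rough sketch of a four-layer dense spiking network in NengoDL. This is untested and written against the current NengoDL API (which may differ from older versions), and the layer sizes are just placeholders:

```python
import numpy as np
import tensorflow as tf
import nengo
import nengo_dl

# load MNIST; NengoDL expects a time axis, so we add a singleton step dim
(train_images, train_labels), _ = tf.keras.datasets.mnist.load_data()
train_images = train_images.reshape((-1, 1, 28 * 28)).astype(np.float32) / 255
train_labels = train_labels.reshape((-1, 1, 1))

with nengo.Network() as net:
    # input node presenting flattened 28x28 images
    inp = nengo.Node(np.zeros(28 * 28))

    # four dense layers; the first three are followed by spiking LIF
    # neurons (NengoDL swaps in a rate approximation during training)
    x = nengo_dl.Layer(tf.keras.layers.Dense(256))(inp)
    x = nengo_dl.Layer(nengo.LIF())(x)
    x = nengo_dl.Layer(tf.keras.layers.Dense(128))(x)
    x = nengo_dl.Layer(nengo.LIF())(x)
    x = nengo_dl.Layer(tf.keras.layers.Dense(64))(x)
    x = nengo_dl.Layer(nengo.LIF())(x)
    out = nengo_dl.Layer(tf.keras.layers.Dense(10))(x)

    out_p = nengo.Probe(out)

with nengo_dl.Simulator(net, minibatch_size=100) as sim:
    sim.compile(
        optimizer=tf.optimizers.Adam(),
        loss=tf.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
    sim.fit({inp: train_images}, {out_p: train_labels}, epochs=10)
```

The linked example is the authoritative version; it also covers things like presenting inputs over multiple timesteps at test time, which I’ve left out here.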

The code that you’ve posted looks like it’s taken from the NengoExtras single-layer MNIST example here. That example takes a bit of a different approach: it has a single layer, and does not train the network end-to-end. It uses a fixed set of weights from the input to the hidden layer, which we call the “encoders”. We then learn the weights from the hidden layer to the output, called the “decoders”, to minimize our classification loss. This method is based on the principles of the Neural Engineering Framework, which you can learn more about here.
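
In outline, that approach looks something like this (a rough sketch rather than the example verbatim; the random data and the plain Gaussian encoders here are placeholders):

```python
import numpy as np
import nengo

n_vis = 28 * 28  # input dimensions (one flattened image)
n_hid = 1000     # hidden-layer neurons
n_out = 10       # output classes

# stand-ins for your MNIST data: swap in real images and one-hot labels
rng = np.random.RandomState(9)
X_train = rng.uniform(-1, 1, size=(5000, n_vis))
T_train = np.eye(n_out)[rng.randint(n_out, size=5000)]

# fixed input weights (the "encoders"); drawn from a distribution
# rather than learned
encoders = rng.normal(size=(n_hid, n_vis))

with nengo.Network(seed=3) as model:
    a = nengo.Ensemble(n_hid, n_vis, encoders=encoders,
                       neuron_type=nengo.LIFRate())
    v = nengo.Node(size_in=n_out)
    # learn the output weights (the "decoders") offline, by regularized
    # least squares on the training data
    conn = nengo.Connection(a, v, synapse=None,
                            eval_points=X_train, function=T_train,
                            solver=nengo.solvers.LstsqL2(reg=0.01))
```

The decoders are actually solved for when the model is built (i.e. when you create a `nengo.Simulator`), at which point you can inspect them with `sim.data[conn].weights`.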

In that example, I show a number of different ways of generating the encoders, and the idea is to pick one of them. The “sparse Gabor filter encoders” perform the best, so that’s probably the distribution you want to use.
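
For reference, those sparse Gabor encoders are built with the `Gabor` and `Mask` helpers from `nengo_extras.vision`, roughly like so (filter count and size are the example’s defaults, but treat them as tunable):

```python
import numpy as np
from nengo_extras.vision import Gabor, Mask

rng = np.random.RandomState(9)

# generate 1000 Gabor filters, each 11x11 pixels
gabors = Gabor().generate(1000, (11, 11), rng=rng)

# place each filter at a random location within a 28x28 image,
# zero-padding the rest -- this is what makes the encoders "sparse"
encoders = Mask((28, 28)).populate(gabors, rng=rng, flatten=True)
```

You’d then pass those `encoders` to the Ensemble in place of the plain random ones in the sketch above.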

Let me know which approach you’re more interested in, and if you have more questions.
