Converting a neural network into an SNN: softmax activation function

Hi everyone,

I have a question about converting a neural network into an SNN using NengoDL.

  1. In the example ( ), the output layer has no softmax or any other activation function. However, in the probability plot, “tf.nn.softmax” is applied to the output. Isn’t that equivalent to using a softmax activation in the output layer? Also, if I train the network with a softmax layer, can I still convert the entire network to neuromorphic hardware like Loihi? And in that case, can it still be considered a complete SNN?
  2. How can I verify that the network predicts the correct class without the softmax layer?
  3. The paper ( ) mentions implementing the approach “entirely within a spiking neural network simulated on low-power neuromorphic hardware”. It also states that the model learns the mapping from features to textures using a Support Vector Machine (SVM). Is using an SVM truly consistent with an entirely spiking network? And what is the rationale for using an SVM instead of performing the classification within Nengo?
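
Regarding question 2, my current understanding is that softmax is strictly monotonic, so it never changes which output unit is largest; if that’s right, the predicted class can be read off the raw logits with argmax. A minimal sketch with made-up logits (not the actual example’s outputs) to check this:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# hypothetical batch of raw output-layer values (5 samples, 10 classes)
rng = np.random.default_rng(0)
logits = rng.normal(size=(5, 10))

# softmax is strictly increasing per row, so the argmax is unchanged
pred_from_logits = np.argmax(logits, axis=1)
pred_from_probs = np.argmax(softmax(logits), axis=1)
assert (pred_from_logits == pred_from_probs).all()
```

So, if I understand correctly, softmax is only needed when calibrated probabilities are required, not for picking the class itself — please correct me if I’m missing something.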

Thank you for your attention and assistance.