Multiplications in Nengo SNN

Hi,

The hardware implementation of SNNs does not possess multipliers. However, Nengo converts a CNN to an SNN, where each convolutional layer is followed by a LIF layer, and each convolutional layer involves multiplication and addition operations. So I was wondering whether we can still consider SNNs a multiplierless design, especially with NengoLoihi.

Since the spikes used by SNNs are binary (either a neuron spikes or it doesn’t, there is no magnitude associated with the spikes), spikes can always be viewed as having a value of “1”. Since any number multiplied by “1” is that same number, it means that no multiplication has to be done as part of the weight update; rather, if an input neuron has spiked, then the relevant weights are simply added to the relevant neurons.
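To make this concrete, here is a toy Python sketch (not Loihi's actual implementation) showing that multiplying by a binary spike is the same as a conditional add; the weight values are made up for illustration:

```python
import numpy as np

weights = np.array([0.2, -0.5, 0.7])  # weights from one input neuron (made-up values)
spike = 1                             # binary spike: either 0 or 1

# Multiplying by a binary spike...
update_mul = spike * weights

# ...is equivalent to conditionally adding the weights, so no multiplier is needed.
update_add = weights if spike else np.zeros_like(weights)

assert np.array_equal(update_mul, update_add)
```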

When we convert a CNN to an SNN, we turn the rate output of the neurons into spiking outputs. For example, a neuron in a CNN may have an output of 250. For the SNN, we interpret this as 250 Hz, and assuming our timestep is 1 ms (i.e. 1000 timesteps per second), that means the spiking neuron will spike once every 4 timesteps. So whereas in the CNN, we would have to multiply the value of 250 by all the weights in the following layer, in the SNN, we just have a series of spikes and so the weight updates can be done with addition.
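Here is a rough numerical sketch of that example (the numbers are illustrative and the spike timing is idealized), showing that adding the weight on every spike accumulates the same total as the CNN's multiplication:

```python
import numpy as np

rate = 250.0    # CNN activation, interpreted as 250 Hz
dt = 0.001      # 1 ms timestep
n_steps = 1000  # simulate 1 second
weight = 0.3    # one weight in the following layer (made-up value)

# CNN view: multiply the activation by the weight.
cnn_contribution = rate * weight  # 75.0 per second

# SNN view: the neuron spikes once every 1 / (rate * dt) = 4 timesteps,
# and each spike just adds the weight (no multiplication).
spikes = np.zeros(n_steps)
spikes[::int(round(1 / (rate * dt)))] = 1
snn_contribution = 0.0
for s in spikes:
    if s:
        snn_contribution += weight

print(cnn_contribution, snn_contribution)  # both approximately 75
```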

So the overall answer is yes, NengoLoihi does produce networks that are compatible with the multiplierless design of Loihi. (That said, I don’t think Loihi is completely multiplier-free. I believe that the decay of the current u and voltage v of a neuron involves a multiplier. But you’re correct that the weight updates do not require a multiplier.)
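For reference, here is a rough sketch of why that decay step involves a multiply; the decay factors here are made up, and this is not Loihi's actual fixed-point implementation:

```python
# Made-up per-timestep decay factors; Loihi uses its own fixed-point decay constants.
decay_u = 0.9   # decay of the input current u
decay_v = 0.95  # decay of the membrane voltage v

u, v = 1.0, 0.5
u = u * decay_u      # exponential decay of the current: a multiplication
v = v * decay_v + u  # decay of the voltage (multiplication) plus input (addition)
```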

I hope that answers your question. Let me know if anything is unclear, or if there were any parts of the question that I didn’t answer or that I misunderstood.

I was thinking the same, but when I find the minimum and maximum of the output of any LIF layer, I get 0 and 9.999999 respectively. I was wondering why they are not just 0 and 1?

I am using the Nengo code available online for MNIST classification, and it works fine in terms of classification accuracy.

There’s a lot of scaling that goes on in various places. One of these is to scale spikes by 1 / dt when reporting them to the user. This has the nice property that when you take the mean of a neuron’s output over all timesteps, it gives you the average firing rate of that neuron in Hz; it also keeps things consistent with how nengo.Simulator does it.

Another scaling is the amplitude attribute on neuron types, which allows for scaling the output of a neuron up or down. We use this when training SNNs because initial weights are typically configured assuming that neurons have outputs in the range [0, 1]. For that MNIST example, I think we use dt=0.001 and amplitude=0.01, so amplitude / dt = 10 (which corresponds to the 9.999999 you’re getting).

All those scaling factors get folded into the weights when we actually map things to Loihi (i.e. rather than having spikes transmit a value of 10, we scale up all the weights by a factor of 10 to get the same effect).
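To see the amplitude / dt scaling directly, here is a minimal sketch (not the MNIST network itself), assuming a recent version of Nengo; the ensemble size and input are just for illustration:

```python
import nengo

with nengo.Network() as net:
    stim = nengo.Node(1.0)
    ens = nengo.Ensemble(
        10, 1,
        neuron_type=nengo.LIF(amplitude=0.01),  # scale each spike's reported value
    )
    nengo.Connection(stim, ens)
    p = nengo.Probe(ens.neurons)  # probe the raw spike output

with nengo.Simulator(net, dt=0.001) as sim:
    sim.run(0.1)

# Each spike is reported as amplitude / dt = 0.01 / 0.001, i.e. the ~10
# (9.999999...) value observed above.
print(sim.data[p].max())
```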

If you’re interested in computing the firing rates, just look at when neurons have an output > 0, because that indicates when a neuron has spiked.
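Continuing the sketch above (it reuses the sim and p defined there), a firing-rate estimate could look like this:

```python
# Count timesteps with nonzero output (i.e. spikes) for each neuron,
# then divide by the 0.1 s run length to get rates in Hz.
spike_counts = (sim.data[p] > 0).sum(axis=0)
rates_hz = spike_counts / 0.1
print(rates_hz)
```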

Thank you so much for such a detailed response.