Number of parameters in SNN and CNN

Hi everyone!
I created a Nengo network and another non-spiking network with the same architecture. Is it normal to have approximately the same number of trainable parameters but an enormously higher number of non-trainable parameters for the spiking neural network?
My non-spiking network has 3,085,155 trainable and 486 non-trainable parameters. My spiking neural network has 3,354,981 trainable and 174,016,834 non-trainable parameters.
I am new to the world of spiking neural networks and would like to understand why the number of non-trainable parameters differs so much.

Thank you so much in advance!

This is how I calculate the number of parameters for the spiking neural network:
import numpy as np

# sim is a nengo_dl.Simulator; keras_model.weights includes both the
# trainable and the non-trainable variables.
nengo_params = sum(
    np.prod(w.shape) for w in sim.keras_model.weights)
nengo_trainable_params = sum(
    np.prod(w.shape) for w in sim.keras_model.trainable_weights)
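
As a quick cross-check (assuming the same sim object as above), Keras's built-in summary prints the trainable and non-trainable totals directly:

# The footer of the per-layer table lists total, trainable and
# non-trainable parameter counts as Keras itself counts them.
sim.keras_model.summary()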

Hi @Louuiissaa and welcome to the forum!

Great question. Currently (at the time of writing), the most recent release of Nengo-DL includes many variables in the non-trainable parameter count that aren’t traditionally called “parameters”, such as the state that the simulator maintains from one time-step to the next.
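
A minimal sketch (assuming your nengo_dl.Simulator is named sim, as in your snippet) to see which variables end up in that non-trainable bucket:

# Simulator state carried from one time-step to the next shows up here
# alongside any genuinely non-trainable parameters.
for w in sim.keras_model.non_trainable_weights:
    print(w.name, w.shape)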

That said, the current master branch (since this PR was merged) separates those variables out from the non-trainable parameters. So if you run the command:

pip install git+https://github.com/nengo/nengo-dl

then that will install the master branch of Nengo-DL, which will give you the expected non-trainable parameter count. This step will not be needed once the next release occurs. Let us know if that works, thanks!

Edit: As a correction, nengo-dl>=3.2.0 is sufficient since the change has been released (you don’t need the master branch necessarily).

Thank you @arvoelke for the quick reply!
I have installed the master branch of Nengo-DL as you said. Now the ratio of trainable to non-trainable parameters makes more sense. Now I get 174,016,834 trainable and 3,354,981 non-trainable parameters. However, I still have not fully understood why a spiking neural network has such a high number of parameters compared to a standard convolutional neural network with the same overall architecture. Is it because spiking neural networks incorporate the concept of time into the model?

Thanks!

This is confusing to me. These are the same numbers from your original post, but with the trainable and non-trainable counts swapped. Are you sure the master branch is being used? I wouldn’t expect there to be that many parameters. Could you run print(nengo_dl.__version__) inside your code to check the version?
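
For reference, something like this near the top of your script shows which installation is actually being imported:

import nengo_dl
# A PyPI release prints a plain version such as 3.0.0; an install from the
# master branch prints a development version instead.
print(nengo_dl.__version__)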

Yes, you are right! I had a calculation error; the numbers are still the same as in my first post.
For my first post I had Nengo-DL version 2.2.1, and now after the installation I have 3.0.0.

Sorry for the confusion

The print(nengo_dl.__version__) should show 3.2.1.dev0 to indicate that you are on the current master branch. If it is printing 3.0.0 then that means it’s not picking up the installation from pip install git+https://github.com/nengo/nengo-dl.

I’d also like to make a correction to my first reply. This change was applied in version 3.2.0, so you don’t need the master branch. You would just need the most recent version (pip install --upgrade nengo-dl or pip install "nengo-dl>=3.2.0"; see the changelog).

It works now! I had to uninstall and reinstall nengo_dl. Now I am using nengo_dl version 3.2.0.
The number of trainable params is now 1,004,889, and the number of non-trainable params is 0.

Thanks a lot @arvoelke for your help :slight_smile:
