Convert Unet to SNN

I converted my U-Net model to an SNN to increase the accuracy, but a problem occurs. I don't know the reason: when I train models with around 4 million parameters such as VGG or AlexNet (even more than 4 million), there is no problem, but with my U-Net, which has only 1.809 million parameters, training crashes (it runs out of RAM). Can you help me, please? My code (the U-Net model) is attached. I also see that tf.keras.layers.Cropping2D and tf.keras.layers.ConvolutionTransform2D cannot be converted to native Nengo objects, so the accuracy is affected, isn't it?
UNET_SNN_Nengo.py (5.2 KB)

While TensorFlow memory usage is related to the number of parameters, other things use memory as well (for example, the activations stored for each example in the minibatch), so a model with fewer parameters doesn't necessarily use less memory.

Probably the easiest thing to try is using a smaller minibatch size. It looks like you’re using 200. You could try 128, and if that doesn’t work try 64, 32, 16, 8, etc. until you get one that works. (I prefer powers of 2 because they often run faster on GPUs.)
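As a rough sketch of where that setting goes (this is not your attached script; `model`, `train_images`, and `train_labels` stand in for the U-Net Keras model and training data from the attachment):

```python
# Sketch: the minibatch size is set when constructing the NengoDL simulator.
# `model`, `train_images`, and `train_labels` are placeholders here.
import tensorflow as tf
import nengo_dl

converter = nengo_dl.Converter(model)

with nengo_dl.Simulator(converter.net, minibatch_size=32) as sim:  # try 128, 64, 32, ...
    sim.compile(
        optimizer=tf.optimizers.Adam(),
        loss=tf.losses.BinaryCrossentropy(),
    )
    sim.fit(
        # data shaped (n_examples, n_steps, flattened_size)
        {converter.inputs[model.input]: train_images},
        {converter.outputs[model.output]: train_labels},
        epochs=10,
    )
```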

The fact that some layers can’t be converted to native Nengo objects won’t necessarily be detrimental to accuracy in NengoDL. One main downside is that it may make it impossible to run the network on other backends, e.g. NengoLoihi.
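A sketch of how to check this, again assuming `model` is the U-Net Keras model from the attachment:

```python
# Sketch, assuming `model` is the U-Net Keras model from the attached file.
import nengo_dl

# With allow_fallback=True (the default), layers that have no native Nengo
# equivalent are wrapped in TensorNodes, which run the original TensorFlow
# code inside the Nengo network, so accuracy in NengoDL is preserved.
converter = nengo_dl.Converter(model, allow_fallback=True)

# Check that the converted network reproduces the Keras model's output.
converter.verify()

# Setting allow_fallback=False instead makes the conversion raise an error
# if any layer cannot be converted to native Nengo objects, which is a
# quick way to see exactly which layers are affected.
# nengo_dl.Converter(model, allow_fallback=False)
```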

Thanks for your response. Could you also answer my second question: tf.keras.layers.Cropping2D and tf.keras.layers.ConvolutionTransform2D cannot be converted to native Nengo objects, so the accuracy is affected, isn't it?

I did:
