@xchoo Thank you for your response.
I am using a Dense layer with the default activation, which is linear, not ReLU. Nevertheless, I checked with different activations, but the results still did not match.
Furthermore, I observed that if I replace the convolutional layer with a dense layer and do not use a LIF layer afterwards, then I do get the correct mapping after the second dense layer (which uses LIF). You can see this in the attached notebook file.
Test_1.ipynb (82.1 KB)
Does this mean that something is wrong with how the connection weights are assigned to the neurons, or am I doing something wrong elsewhere?
However, if I do the same with the convolutional layer, i.e., do not use LIF after the convolutional layer, I do not get any match. You can see this in the attached notebook file.
Test_2.ipynb (84.1 KB)
How can I fix this behavior? Any suggestions?
Thanks in advance for your response.