Encoding an Image as a nengo.Ensemble

Hi @xchoo, I did some work on MNIST learning using the PES rule, as also described in this thread. I would just like to ask your opinion on whether what I am witnessing is an effect of catastrophic interference/forgetting.

I also made the example into a notebook to document the experiment:
catastrophic-forgetting.ipynb (20.5 KB)
It seems like the moment I inhibit the error neurons, the network stops learning and only predicts every input as the last seen sample's label.
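
For reference, here is a minimal sketch of the kind of wiring I mean (not the actual MNIST pipeline from the notebook, just a 1D stand-in with hypothetical names like `inhibit`): a PES-learned connection whose error ensemble can be shut off by driving its neurons with a negative current.

```python
import numpy as np
import nengo

with nengo.Network() as model:
    stim = nengo.Node(np.sin)     # stand-in for the image input
    target = nengo.Node(np.sin)   # stand-in for the label signal

    pre = nengo.Ensemble(n_neurons=100, dimensions=1)
    post = nengo.Ensemble(n_neurons=100, dimensions=1)
    error = nengo.Ensemble(n_neurons=100, dimensions=1)

    nengo.Connection(stim, pre)

    # Learned connection: PES adjusts the decoders to minimise the error signal
    conn = nengo.Connection(
        pre, post, function=lambda x: 0.0,
        learning_rule_type=nengo.PES(learning_rate=1e-4),
    )

    # error = post - target, fed into the learning rule
    nengo.Connection(post, error)
    nengo.Connection(target, error, transform=-1)
    nengo.Connection(error, conn.learning_rule)

    # Inhibition: a non-zero value here silences the error neurons,
    # so the PES update (proportional to the decoded error) goes to zero
    inhibit = nengo.Node(lambda t: 1.0 if t > 5.0 else 0.0)
    nengo.Connection(inhibit, error.neurons,
                     transform=-10 * np.ones((error.n_neurons, 1)))
```

With this kind of setup, inhibiting the error population freezes the decoders at whatever they were when inhibition kicked in, which is why I suspect the "stuck on the last seen label" behaviour is related to forgetting during the learning phase rather than to the inhibition itself.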