Non-differentiable network optimization in Nengo

Hi @gonen!

NengoDL is an extension to Nengo (what we call a “backend”) that uses TensorFlow under the hood to represent all operations, which allows gradient descent to be applied across the whole network.

You are correct that this is less biologically plausible. For biological plausibility, we recommend the features in core Nengo. We have a number of tutorials on learning that use biologically plausible local learning rules, including PES (which is error-based) and BCM and Oja (which are Hebbian); a minimal PES example is sketched below. If you’re interested in running on a GPU, you can still use NengoDL to simulate these models without using NengoDL’s support for gradient descent.
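As a minimal sketch of the PES pattern (the network and all parameters here are illustrative, not taken from any particular tutorial), the learned connection starts out computing zero and adapts online to track the stimulus, driven by a locally computed error signal:

```python
import numpy as np
import nengo

with nengo.Network() as model:
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))
    pre = nengo.Ensemble(n_neurons=100, dimensions=1)
    post = nengo.Ensemble(n_neurons=100, dimensions=1)
    error = nengo.Ensemble(n_neurons=100, dimensions=1)

    nengo.Connection(stim, pre)
    # Start from a function-of-zero connection; PES adapts the decoders.
    conn = nengo.Connection(pre, post, function=lambda x: [0],
                            learning_rule_type=nengo.PES(learning_rate=1e-4))

    # Error = actual output - target (here, the target is the stimulus itself).
    nengo.Connection(post, error)
    nengo.Connection(stim, error, transform=-1)
    nengo.Connection(error, conn.learning_rule)

with nengo.Simulator(model) as sim:
    sim.run(10.0)
```

To run the same model on the GPU, you can swap in the NengoDL simulator without touching the model (assuming `nengo_dl` is installed):

```python
import nengo_dl

with nengo_dl.Simulator(model) as sim:
    sim.run(10.0)
```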

As for other methods, some people have looked at using genetic algorithms with Nengo, though we do not have built-in support for this; a toy outer-loop sketch is below. And if you’re interested in deep networks specifically, I investigated Feedback Alignment in Nengo in chapter 6 of my thesis; it worked well for MNIST, but I found (as others have also found) that it doesn’t scale well to more challenging datasets. (Code for MNIST using Feedback Alignment in Nengo is here, but be warned that it’s a bit old now, so it might not work with newer versions of Nengo out of the box.)
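Since there is no built-in GA support, the outer loop is just ordinary Python wrapped around repeated builds and simulations. Here is a toy sketch in which everything (the fitness function, the mutation scheme, the population size) is made up for illustration:

```python
import numpy as np
import nengo

def fitness(transform, target_fn=np.sin, sim_time=1.0):
    """Build and simulate a small network; return negative squared error."""
    with nengo.Network() as model:
        stim = nengo.Node(lambda t: t)
        ens = nengo.Ensemble(n_neurons=50, dimensions=1)
        out = nengo.Node(size_in=1)
        nengo.Connection(stim, ens)
        nengo.Connection(ens, out, transform=transform)
        probe = nengo.Probe(out, synapse=0.01)
    with nengo.Simulator(model, progress_bar=False) as sim:
        sim.run(sim_time)
        t = sim.trange()
        return -np.mean((sim.data[probe][:, 0] - target_fn(t)) ** 2)

rng = np.random.default_rng(0)
population = [rng.normal(size=(1, 1)) for _ in range(20)]
for generation in range(10):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:5]  # keep the fittest individuals
    # Refill the population with mutated copies of the parents.
    population = parents + [p + rng.normal(scale=0.1, size=p.shape)
                            for p in parents for _ in range(3)]
```

And purely to illustrate the Feedback Alignment mechanism (this is a toy numpy version, not the linked MNIST code), the key idea is that the error is sent back through a fixed random matrix `B` rather than the transpose of the forward weights `W2`:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(20, 2))   # input -> hidden
W2 = rng.normal(scale=0.1, size=(1, 20))   # hidden -> output
B = rng.normal(scale=0.1, size=(20, 1))    # fixed random feedback weights
lr = 0.01

for _ in range(5000):
    x = rng.normal(size=2)
    target = np.array([x[0] * x[1]])        # toy regression target
    h = np.tanh(W1 @ x)                     # hidden activity
    y = W2 @ h                              # network output
    e = y - target                          # output error
    W2 -= lr * np.outer(e, h)               # delta rule at the output
    W1 -= lr * np.outer((B @ e) * (1 - h**2), x)  # FA: B replaces W2.T
```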
