Learning appropriate tuning curves


Hello everybody,

I’m new to Nengo and trying to compute a simple division (for example 1/x) with a two-dimensional ensemble. Unfortunately, the results are not satisfactory.

I’ve already found some older topics that describe improvements for exactly this problem.

I tried to solve the issue with the inaccurate results on my own by manipulating the intercepts and encoders, but this didn’t lead to an acceptable solution.

The learning example led to a quite good solution.
Now I’m wondering whether it is possible to learn not only the weight matrix of the connection but also the tuning curves of the ensemble.

I hope this isn’t an awkward question, and I’m thankful for any help or ideas.


You can use NengoDL to optimize the tuning curves of neurons. Here is a basic example showing what the process looks like; in your case you’d be learning a division function instead of the function $f$ in that example.

Note that the tuning curves are, in general, determined by the gains and biases, plus any parameters specific to that neuron type (e.g., the RC time constant for LIF neurons). NengoDL will only optimize the gains and biases, not the neuron-specific parameters. In theory nothing prevents the same techniques from being applied to those parameters; I just haven’t enabled that yet, because in the use cases I’ve seen people want to keep those parameters fixed. But if you wanted to work on something like that, I could get it going pretty quickly.
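To make the gain/bias dependence concrete, here is a small NumPy sketch of the standard steady-state LIF rate equation that Nengo uses; the particular gains, biases, and time constants below are made up for illustration:

```python
import numpy as np

def lif_rates(x, gain, bias, tau_rc=0.02, tau_ref=0.002):
    """Steady-state LIF firing rate for input current J = gain * x + bias."""
    J = gain * np.asarray(x, dtype=float) + bias
    rates = np.zeros_like(J)
    active = J > 1  # the neuron only fires once its current exceeds threshold
    rates[active] = 1.0 / (tau_ref - tau_rc * np.log(1 - 1.0 / J[active]))
    return rates

x = np.linspace(-1, 1, 5)
# two neurons with different gains and biases -> different tuning curves;
# changing gain/bias shifts and rescales the curve, which is what
# NengoDL's optimization adjusts
print(lif_rates(x, gain=2.0, bias=1.5))
print(lif_rates(x, gain=-1.0, bias=2.0))
```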

Also note that, by default, NengoDL will be optimizing the connection weights as well, not just the tuning curves. If you wanted to only optimize tuning curves and not connection weights (just using the default Nengo weight solvers), you could add

with nengo.Network() as my_network:
    # tell NengoDL to read trainability from the config system
    nengo_dl.configure_settings(trainable=None)
    # connections keep their Nengo-solved weights; only gains/biases are trained
    my_network.config[nengo.Connection].trainable = False


Thanks a lot for your detailed answer. I will experiment with this simulator - it sounds promising :+1: