I can’t answer all of your questions, but a few things that might help:
- To prevent PES from modifying the decoders, you can ensure that the error signal is 0. How exactly to do that depends on your network structure, but it is common to have a neural ensemble provide the error signal. You could then inhibit that ensemble (e.g.,
nengo.Connection(no_learning, error.neurons, transform=-2. * np.ones((error.n_neurons, 1)))). If you want the learning to happen for exactly one timestep, you should also set
synapse=None on this connection to prevent the inhibition signal from being filtered (however, this might not be biologically plausible, if you care about that).
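To see why zeroing the error freezes the decoders, here is a minimal NumPy sketch of the standard PES decoder update, Δd = −(κ/n) e aᵀ (this is illustrative, not the actual Nengo implementation; the function name and kappa value are made up):

```python
import numpy as np

def pes_update(decoders, activities, error, kappa=1e-4):
    """One illustrative PES step: delta_d = -(kappa / n) * outer(error, activities)."""
    n = activities.shape[0]
    return decoders - (kappa / n) * np.outer(error, activities)

rng = np.random.default_rng(0)
d = rng.standard_normal((1, 5))   # decoders: 1 output dimension, 5 neurons
a = rng.uniform(0, 100, size=5)   # presynaptic firing rates

d_learn = pes_update(d, a, error=np.array([0.5]))   # nonzero error: decoders move
d_frozen = pes_update(d, a, error=np.array([0.0]))  # zero error: decoders unchanged
```

Since the update is proportional to the error, inhibiting the error ensemble to silence it makes every update exactly zero, regardless of the presynaptic activity.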
- There is a pull request for Nengo (#1254) that adds the possibility to apply a learning rule only every n timesteps. That might cover your use case, but it requires that the rule is applied at perfectly regular, pre-defined intervals.
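The effect of applying a rule only every n timesteps can be sketched like this (plain Python, not the PR’s actual API; the function, a scalar 1D "decoder", and the learning rate are all illustrative):

```python
def train(steps, every_n, kappa=1e-2):
    """Drive a scalar decoder d toward a target, updating only every n steps."""
    d = 0.0
    target, x = 1.0, 1.0
    for t in range(steps):
        if t % every_n == 0:           # learning fires only at regular intervals
            error = d * x - target     # PES-style error: actual minus target
            d -= kappa * error * x
    return d
```

With the same number of simulation steps, updating every 10th step simply applies one tenth as many updates, so convergence is correspondingly slower.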
The pre_tau parameter gives the time constant of an exponential lowpass filter applied to the activities of the presynaptic ensemble (if I recall correctly).
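As a sketch of what such a time constant does, here is a discrete exponential lowpass in NumPy (a generic first-order filter, y[k] = (1 − a)·y[k−1] + a·x[k] with a = 1 − exp(−dt/τ); not Nengo’s internal code):

```python
import numpy as np

def lowpass(x, tau, dt=0.001):
    """First-order exponential lowpass of signal x with time constant tau."""
    a = 1.0 - np.exp(-dt / tau)        # per-step smoothing factor
    y = np.zeros_like(x, dtype=float)
    for k in range(1, len(x)):
        y[k] = (1 - a) * y[k - 1] + a * x[k]
    return y

step = np.ones(1000)                   # a 1-second step input at dt = 1 ms
y_fast = lowpass(step, tau=0.005)      # small tau: tracks the input quickly
y_slow = lowpass(step, tau=0.1)        # large tau: responds sluggishly
```

A smaller time constant means the filtered activities follow the raw firing rates more closely; a larger one smooths them over a longer history.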
I assume that depends on how quickly your input and error signals are changing. If they change on a slower time course, the result should be approximately the same. If they change more quickly, then the information during those timesteps can contribute in a different way.
This one is correct.