Learn synaptic delays

Hi @xchoo, thank you for your detailed answer, much appreciated!

I just found out that implementing delays (instantiated as a buffer) inside a Synapse object would cause serious efficiency issues. Since each synaptic connection is handled pointwise (one presynaptic to one postsynaptic neuron), I cannot take advantage of vectorized numpy operations for the delay buffer. Instead I would have to instantiate and update a separate buffer for every connection, which would be catastrophic for a fully connected network.

Here is some non-Nengo code that instantiates a buffer for the delays and updates it with vectorized operations:
cylindrical_buffer.ipynb (78.3 KB)
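
For readers who don't want to open the notebook, here is a minimal standalone sketch of the same idea (the names are illustrative, not taken from the notebook): a single ring buffer holds the last `max_delay_steps` samples, and per-element integer delays are read out with one fancy-indexing operation, so there is no per-connection Python loop.

```python
import numpy as np

class CircularDelayBuffer:
    """Vectorized delay line: one integer delay per element, one shared ring buffer."""

    def __init__(self, n, delays_steps, max_delay_steps):
        # Each per-element delay must be strictly smaller than max_delay_steps.
        self.n = n
        self.delays = np.asarray(delays_steps, dtype=int)
        self.max_delay = max_delay_steps
        self.buffer = np.zeros((max_delay_steps, n))  # rows = past timesteps
        self.head = 0  # row where the newest sample is written

    def step(self, x):
        # Write the newest sample, then read every element at its own delay
        # in a single vectorized indexing operation.
        self.buffer[self.head] = x
        read_rows = (self.head - self.delays) % self.max_delay
        out = self.buffer[read_rows, np.arange(self.n)]
        self.head = (self.head + 1) % self.max_delay
        return out

# Usage: three signals delayed by 1, 5, and 10 timesteps respectively.
buf = CircularDelayBuffer(n=3, delays_steps=[1, 5, 10], max_delay_steps=20)
for t in range(15):
    y = buf.step(np.full(3, float(t)))
```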

So my question is: is it possible to bypass the synapse machinery when implementing this buffer, so that it can benefit from vectorized operations over the whole set of connections?
In this post you mentioned the signal flow:

signal input → × connection weight → synapse applied → neuron non-linearity → signal output

What I would like to do is:
signal input → buffered (delayed) signal → × connection weight → neuron non-linearity → signal output

Or alternatively:
signal input → × connection weight → buffered (delayed) signal → neuron non-linearity → signal output
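
To make the first variant concrete, the buffer above could be wrapped in a plain `nengo.Node` sitting in front of the connection. This is only a sketch, and not quite what I'm after: it gives one delay per presynaptic dimension (applied before the weights), not one delay per pre-post pair. It reuses the `CircularDelayBuffer` class sketched above.

```python
import numpy as np
import nengo

dt = 0.001
delays_s = np.array([0.005, 0.010, 0.020])  # one delay per input dimension, in seconds
buf = CircularDelayBuffer(
    n=3,
    delays_steps=np.round(delays_s / dt).astype(int),
    max_delay_steps=int(0.05 / dt),
)

with nengo.Network() as model:
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t) * np.ones(3))
    # The Node's callable sees the whole 3-D signal at once,
    # so the buffer update stays fully vectorized.
    delay_node = nengo.Node(lambda t, x: buf.step(x), size_in=3, size_out=3)
    post = nengo.Ensemble(n_neurons=90, dimensions=3)

    nengo.Connection(stim, delay_node, synapse=None)
    # Delayed signal → connection weights → neuron non-linearity,
    # as in the first flow above.
    nengo.Connection(delay_node, post, synapse=None)
```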

In addition to the vectorization problem, I would also need access to the delays themselves so that I can update them with an array of Delta_delays. Would it still be possible to implement a learning rule (interface + builder + operator) for a non-synaptic object?
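
To make the question concrete, below is the kind of skeleton I have in mind, modeled on how custom learning rules are normally registered with Nengo's builder (a `LearningRuleType` subclass, a build function registered via `Builder.register`, and an `Operator`). `DelayLearning`, `SimDelayLearning`, and the `"delays"` signal lookup are all hypothetical placeholders; in particular, `modifies = "delays"` is not a target the builder currently understands, which is exactly the part I am unsure about.

```python
import numpy as np
from nengo.builder import Builder, Operator, Signal
from nengo.learning_rules import LearningRuleType

class DelayLearning(LearningRuleType):
    """Hypothetical rule that adapts per-connection delays."""
    modifies = "delays"  # NOT a standard target; Nengo knows "weights", "decoders", "encoders"
    probeable = ("delta",)

    def __init__(self, learning_rate=1e-4):
        super().__init__(learning_rate)

class SimDelayLearning(Operator):
    """Operator applying an array of Delta_delays to the delay signal each timestep."""

    def __init__(self, delays, error, learning_rate, tag=None):
        super().__init__(tag=tag)
        self.learning_rate = learning_rate
        self.sets = []
        self.incs = []
        self.reads = [error]
        self.updates = [delays]

    def make_step(self, signals, dt, rng):
        delays = signals[self.updates[0]]
        error = signals[self.reads[0]]
        alpha = self.learning_rate * dt

        def step_simdelaylearning():
            # Placeholder update; the actual Delta_delays computation goes here.
            delays[...] += alpha * error

        return step_simdelaylearning

@Builder.register(DelayLearning)
def build_delay_learning(model, delay_learning, rule):
    conn = rule.connection
    # Open question: a synapse-based buffer would get its delay Signal during
    # the connection build; for a non-synaptic object I don't know where this
    # Signal should be created, so this lookup is purely assumed.
    delays = model.sig[conn]["delays"]
    error = Signal(np.zeros(delays.shape), name="DelayLearning:error")
    model.add_op(SimDelayLearning(delays, error, delay_learning.learning_rate))
    model.sig[rule]["delta"] = delays
```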

That would be really great, thanks!