Hi @Adri, and welcome to the Nengo forums.
To answer your questions:
That’s correct. To implement a custom synaptic model, you’ll want to make a subclass of the nengo.synapses.Synapse class.
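To give a rough idea of the shape this takes, here’s a minimal, Nengo-free sketch of the pattern a synapse follows: the class stores configuration, and a make_step-style method returns the function that gets called once per timestep. All names here are illustrative, not the real Synapse API.

```python
from collections import deque

class DelaySynapseSketch:
    """Hypothetical sketch of a pure-delay synapse (not a real Synapse subclass)."""

    def __init__(self, delay_steps):
        self.delay_steps = delay_steps

    def make_step(self):
        # Buffer pre-filled with zeros so the first few outputs are defined.
        buffer = deque([0.0] * self.delay_steps)

        def step(t, x):
            out = buffer.popleft()  # value from delay_steps timesteps ago
            buffer.append(x)
            return out

        return step

step = DelaySynapseSketch(delay_steps=3).make_step()
outs = [step(t, x) for t, x in enumerate([1.0, 2.0, 3.0, 4.0])]
# each input reappears exactly three timesteps later
```

In real Nengo code you would override the corresponding methods of nengo.synapses.Synapse instead, but the closure-over-state structure is the same.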
Implementing a custom learning rule that works with your custom synapse is a little more complicated. A learning rule implementation (like that of almost all Nengo objects) consists of three parts:
- The interface code
- The builder code
- The operator code
The interface code is what the Nengo user uses to create and configure Nengo objects. This is what is contained in the nengo.learning_rules module, and the subclasses of nengo.learning_rules.LearningRuleType are variants of the generic Nengo learning rule interface. In this interface class, you’ll want to provide a mechanism to store the user’s configuration of the learning rule (e.g., the learning rate, or the synapse values).
The learning rule builder code is what Nengo uses to take the learning rule parameters and build the corresponding Nengo objects and Nengo operators that implement said learning rule. The learning rule operator code is what Nengo actually runs during the simulation (i.e., these are the functions that are called to modify the parameters used by the learning rule). Both the builder and operator code for the built-in learning rules can be found in the builder/learning_rules.py file.
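Here’s a hypothetical sketch of the builder/operator split, mirroring the structure in builder/learning_rules.py but not using the real Nengo API: the builder maps the interface object’s parameters onto an operator, and the operator’s make_step returns the function the simulator calls every timestep.

```python
class SimMyRuleSketch:
    """Hypothetical operator: owns the mutable parameters it updates."""

    def __init__(self, weights, learning_rate):
        self.weights = weights
        self.learning_rate = learning_rate

    def make_step(self, error_signal):
        def step():
            # Simple gradient-style update, purely illustrative.
            for i, e in enumerate(error_signal):
                self.weights[i] -= self.learning_rate * e
        return step


def build_my_rule_sketch(rule_config, weights):
    # "Builder": translate user-facing configuration into an operator.
    return SimMyRuleSketch(weights, rule_config["learning_rate"])

weights = [1.0, 1.0]
op = build_my_rule_sketch({"learning_rate": 0.1}, weights)
step = op.make_step(error_signal=[1.0, -1.0])
step()  # one simulation timestep's worth of learning
```

In real Nengo, the builder is registered with the Builder class and the operator subclasses nengo.builder.Operator, but the division of labour is the same.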
To construct your custom learning rule, you’ll need all three components. You can see an example of how a custom learning rule is implemented in this post.
By default, the only variables that can be updated with a learning rule are the encoders, decoders and weights. It should be possible, however (caveat, I haven’t tested it, so I’m not 100% sure), to write your builder and operator functions to work on the synapses instead.
This depends on your implementation of the delay synapse. The operator object for Nengo synapses (the SimProcess operator) calls whatever function is provided to it by the make_step function of the synapse class. If you configure the step function to use attributes of your delay synapse class, then modifying those attributes in your learning rule should (once again, haven’t tested it…) also cause the behaviour of the synapse to change.
If I get the chance to test this type of custom learning rule, I’ll post a reply to this thread.