Nengo `Connection`s mapped to Loihi H/W

Hello there!

I have a network that I intend to deploy on Loihi, and I would like to make the Connections’ weights learnable through a rule specified in the NxSDK API format (e.g. dw = x1*y0 - y1*x0, as in the NxNet tutorials on INRC). To do this, I am using the nengo.Connection object to access the connection’s mapping on Loihi (I am not sure this is the right way to apply my learning rule) with the following code:

loihi_sim = nengo_loihi.Simulator(net)  # net is my sample network
board = loihi_sim.sims["loihi"].board
synapses = loihi_sim.model.objs[conn]  # conn is the nengo.Connection object
board.find_synapse(synapses)

and it throws the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-9-5f020689a243> in <module>
      1 synapses = loihi_sim.model.objs[conn]
----> 2 board.find_synapse(synapses)

~/nengo-loihi/nengo_loihi/hardware/nxsdk_objects.py in find_synapse(self, synapse)
     85 
     86     def find_synapse(self, synapse):
---> 87         return self.synapse_index[synapse]
     88 
     89 

TypeError: unhashable type: 'dict'

Upon looking at the loihi_sim.model.objs output, I see that it has

.
.
.
<Connection at 0x7f9c47834b50 from <Node (unlabeled) at 0x7f9c47834910> to <Neurons of <Ensemble (unlabeled) at 0x7f9cd4fb9700>>>: {}})

i.e. an empty dictionary as the value for the conn object.

Can someone help me access the nengo.Connection object’s mapping on Loihi and specify a custom learning rule? Thank you for your time!

I think the problem is that you can’t have a learning_rule_type on your connection. The only learning_rule_type we support is PES, and with that we do some funny stuff internally to pass the error in. (We actually split the Connection object into two, so when you try to look up the model objects for your original conn object, it returns an empty dict.)

If you don’t have a learning rule, then this works as expected. Here’s a minimal example:

import nengo
import nengo_loihi

with nengo.Network() as net:
    a = nengo.Ensemble(100, 1)
    b = nengo.Ensemble(100, 1)
    conn = nengo.Connection(a, b)

with nengo_loihi.Simulator(net) as sim:
    board = sim.sims["loihi"].board
    nxsdk_board = sim.sims["loihi"].nxsdk_board
    synapse = sim.model.objs[conn]["decoders"]
    assert isinstance(synapse, nengo_loihi.block.Synapse)
    chip_idx, core_idx, syn_idxs = board.find_synapse(synapse)
    nxsdk_chip = nxsdk_board.n2Chips[chip_idx]
    nxsdk_core = nxsdk_chip.n2CoresAsList[core_idx]
    # TODO: set up the learning here

Looking more at what would be involved in setting up that learning, though, it seems like this might be difficult to hack in the way you’re proposing. You can see in the hardware builder where we set the learning rule itself, but if you search through that file for synapse.learning, you can see all the other changes that need to be made for a learning synapse, and it might be difficult to make them all post hoc.

I think the better way to go is to build your learning rule into the builder more properly. So, for example, here, where we make modifications for the PES learning rule, you’d add your learning rule. There are still going to be a number of changes that need to be made throughout the codebase, since right now we basically assume a PES learning rule for all our learning.

The third way, which is perhaps the hackiest but probably also the quickest, is to find a way to get the actual Connection object that does the learning (after we break the original connection into two). The following works for my basic example (but would break if, e.g., you have two connections coming out of a):

import nengo
import nengo_loihi

with nengo.Network() as net:
    a = nengo.Ensemble(100, 1)
    b = nengo.Ensemble(100, 1)
    conn = nengo.Connection(a, b, learning_rule_type=nengo.PES())

with nengo_loihi.Simulator(net) as sim:
    (new_conn,) = [
        obj
        for obj in sim.model.objs
        if isinstance(obj, nengo.Connection) and obj.pre is a
    ]
    synapse = sim.model.objs[new_conn]["decoders"]
    assert isinstance(synapse, nengo_loihi.block.Synapse)

    board = sim.sims["loihi"].board
    chip_idx, core_idx, syn_idxs = board.find_synapse(synapse)

    nxsdk_board = sim.sims["loihi"].nxsdk_board
    nxsdk_chip = nxsdk_board.n2Chips[chip_idx]
    nxsdk_core = nxsdk_chip.n2CoresAsList[core_idx]

    # TODO: set up the learning here

Hello @Eric! Thank you for the quick response and the code snippets! I have the following sample code (on which I will be building my actual network), where I would like to implement learning rules similar to the one in my original question.

import numpy as np

import nengo
import nengo_loihi

with nengo.Network() as net:
    node = nengo.Node(output=lambda t: 1 if int(t * 1000) % 4 == 0 else 0.0)
    ens = nengo.Ensemble(
        n_neurons=1,
        dimensions=1,
        encoders=[[1]],  # shouldn't matter, since a direct connection to the neurons is made
        intercepts=[0],
        max_rates=[250],
        neuron_type=nengo_loihi.LoihiSpikingRectifiedLinear(
            initial_state={"voltage": np.zeros(1)}
        ),
    )
    conn = nengo.Connection(node, ens.neurons, synapse=None, transform=2)

Since I will be working with pre- and post-synaptic traces (as is evident from the example learning rule), I chose to connect directly to the neurons in an Ensemble. The following were my intended next steps (based on my best understanding of the coupling between NengoLoihi and Loihi’s NxSDK).

  1. Access the neurons block through board.find_block(block) and then set the required NxSDK attributes, e.g. vThMant etc. (I actually need to enable spike backprop for the post trace, which I guess I can do by setting the enableSpikeBackprop attribute.)
  2. Access the conn object, find its equivalent/mapping on Loihi through board.find_synapse, and set the learning rule in dw and the time constants, e.g. x1TimeConstant etc. (I am assuming that NengoLoihi doesn’t let me directly access these attributes or set dw.)
  3. Execute the network and analyze.

From your previous reply (and a bit of searching), it appears that PES works by optimizing decoders (same for RLS), Voja optimizes encoders, and BCM and Oja work with pre/post-synaptic activities (which can probably be filtered to get pre/post-synaptic traces). I am completely unaware of how all these learning rules work. If you see a similarity between any of the above and the dw learning rule I mentioned, can you please point me in the right direction? (I guess BCM is closest to the rule mentioned in dw.)
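For my own intuition, here is a plain-NumPy sketch of what the trace-based rule computes (assuming x0/y0 are raw spike indicators and x1/y1 are exponentially decaying spike traces, as in the NxNet tutorials; the time constants, impulse size, and update order here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps = 100
pre_spikes = rng.random(n_steps) < 0.1   # x0: pre-synaptic spike indicator
post_spikes = rng.random(n_steps) < 0.1  # y0: post-synaptic spike indicator

x1_tau, y1_tau = 10.0, 10.0  # trace time constants (arbitrary choices)
x1 = y1 = 0.0                # decaying spike traces
w = 0.0                      # scalar weight

for t in range(n_steps):
    x0, y0 = float(pre_spikes[t]), float(post_spikes[t])
    # dw = x1*y0 - y1*x0: potentiate when the post neuron fires while
    # the pre trace x1 is still high (pre fired recently), depress when
    # the pre neuron fires while the post trace y1 is still high.
    w += x1 * y0 - y1 * x0
    # update the traces: decay, then add the spike impulse
    x1 = x1 * np.exp(-1.0 / x1_tau) + x0
    y1 = y1 * np.exp(-1.0 / y1_tau) + y0
```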

Also, I see that in your first code snippet you are making a connection between Ensembles, whereas I am making a direct connection to neurons from a Node (as the pre-object). I guess this is the reason the return of loihi_sim.model.objs[conn] is still an empty dictionary in my case. The builder code you linked gives me some idea of how setting my learning rule could look, but the link to the PES modifications didn’t give me any hints on where I can specify my dw learning rule.

I guess a simple example of how I can incorporate dw = x1*y0 - y1*x0 in my code snippet above (in this reply), such that the direct connection to the neuron in ens learns a weight, would be very helpful. Can you please give me some pointers here?

I’ve been thinking a bit more about this, and rather than trying to hack it in using board.find_block and commands like that, it might actually be easier to build it into NengoLoihi itself properly. For some idea of how to do that, the place to start is to look at the PES rule and how it’s implemented. PES is a lot more complicated, though, because it requires an external error signal, so we have a lot of machinery associated with getting that error signal in (including a custom SNIP). Since the rules you’re looking at don’t require that, yours should be simpler.

To take this approach, I think there are three significant changes you’d have to make:

  • Modify the build_full_chip_connection function to support your rule type (add an elif statement to the block that starts if isinstance(rule_type, nengo.PES):).
  • Modify the classes in block.py to store the learning parameters you require. You’ll see that for PES, we have a Synapse.set_learning function which we call in the connection builder. You’d probably want a similar function on Synapse that you can call for your learning rule type.
  • Modify the functions in hardware/builder.py to build things correctly to NxSDK. Right now, most of the action happens in build_core. You’ll see that we basically assume all learning is PES, doing things like counting up all the learning synapses and then treating them all as PES, and having the stdpUcodeMem entries hard-coded for PES. So you’ll need to change these to support your learning rule as well.
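To make the second change concrete, here is a hypothetical sketch (not actual NengoLoihi code; the class shape, method, and parameter names are placeholders modeled loosely on the role Synapse.set_learning plays for PES) of where the parameters for a trace rule could be stored:

```python
class Synapse:
    """Toy stand-in for nengo_loihi.block.Synapse, illustrating where
    custom learning parameters could live (all names hypothetical)."""

    def __init__(self, n_axons):
        self.n_axons = n_axons
        self.learning = False
        self.dw = None
        self.x1_tau = None
        self.y1_tau = None
        self.x1_impulse = None
        self.y1_impulse = None

    def set_trace_learning(self, dw, x1_tau=4, y1_tau=4,
                           x1_impulse=1.0, y1_impulse=1.0):
        # The connection builder would call this (instead of set_learning)
        # when it sees your rule type; the hardware builder would then read
        # these attributes when programming the core's learning registers.
        self.learning = True
        self.dw = dw
        self.x1_tau = x1_tau
        self.y1_tau = y1_tau
        self.x1_impulse = x1_impulse
        self.y1_impulse = y1_impulse


syn = Synapse(n_axons=1)
syn.set_trace_learning(dw="x1*y0 - y1*x0", x1_tau=8, y1_tau=8)
```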

If you’re really just interested in getting your rule working and don’t need PES, rather than making the above modifications to support both PES and your rule, you could do them in such a way that you basically just replace PES with your rule. This might be a bit easier.

Sorry that I can’t give you more specific advice or examples. We’ve only ever implemented PES, so we don’t have examples sitting around of how to do other rules.

My final piece of advice would be to start in NxSDK and do a very basic example of your learning rule (with a handful of neurons only), just to get an idea of how learning on Loihi works, and what NxSDK commands you need to be using.

Hello Eric! Thanks for getting back on this. I was caught up in some other work, and now I am gradually catching up on this. I have already implemented my learning rule in NxSDK and it works as expected. WRT your suggestion:

given that it might be easier, as you stated, is the way to do it similar to introducing another rule alongside PES, i.e. following your advice on the three significant changes? Or, if the starting pointers are different, can you please let me know?