Implementing CORnet in Nengo with PyTorch

Hi all,

I am fairly new to Nengo and was wondering how to implement this convolutional neural network using the Nengo library. In particular, I'm wondering what arrangement of Nengo Connections is needed to create a block of convolutional and ReLU layers, and whether this can be abstracted into a reusable "connection module".

I’m leaning on this utility function from the Nengo docs so far:

def conv_layer(x, *args, activation=True, **kwargs):
    # create a Convolution transform with the given arguments
    # (note: this must be called inside a `with nengo.Network():` block)
    conv = nengo.Convolution(*args, channels_last=False, **kwargs)

    if activation:
        # add an ensemble to implement the activation function
        layer = nengo.Ensemble(conv.output_shape.size, 1).neurons
    else:
        # no nonlinearity, so we just use a node
        layer = nengo.Node(size_in=conv.output_shape.size)

    # connect up the input object to the new layer
    nengo.Connection(x, layer, transform=conv)

    # print out the shape information for our new layer
    print("LAYER")
    print(conv.input_shape.shape, "->", conv.output_shape.shape)

    return layer, conv

Hi @davidneuro, and welcome to the Nengo forums! :smiley:

The code snippet you provided can be used to create "blocks" of ReLU layers with a convolutional transform. In particular, to get ReLU neurons, all you need to do is pass neuron_type=nengo.RectifiedLinear() when creating the nengo.Ensemble.

However, looking at the code you linked, and given your familiarity with PyTorch, I'd recommend checking out NengoDL first. NengoDL lets you train (and integrate) TensorFlow models within regular Nengo models, and since PyTorch has a similar structure to TensorFlow, this approach will probably be faster than recreating CORnet in vanilla Nengo yourself.

With NengoDL, there are two approaches you can take. You can create and test your model with TensorFlow, and then use the NengoDL converter to convert it into a Nengo model (see here). Alternatively, you can code directly in NengoDL using wrappers around the Keras Layer API (see this example). Both approaches should yield identical results, but if you are already familiar with TensorFlow, the first approach should get you going much faster.
