Saving the model after training in Nengo Core

Hi,
I am using the classification example “Encoding Image Recognition”:
https://www.nengo.ai/nengo-extras/examples/mnist_single_layer.html
and would like to save the model after training and then use it for inference on a single test image at a time.

In the model definition:
with nengo.Network(seed=3) as model:
    a = nengo.Ensemble(n_hid, n_vis, **ens_params)
    v = nengo.Node(size_in=n_out)
    conn = nengo.Connection(
        a, v, synapse=None,
        eval_points=X_train, function=T_train, solver=solver)
I can save the connection weights (conn), but I'm not sure how to save the model itself, because ens_params contains the training data (X_train). Any help will be very much appreciated!

Thank you!
Regards,
Krishna

The standard way to serialize a Nengo model is to use Python’s pickle library. You can use pickle.dump to save the sim object, and pickle.load to load it.

I’m curious how long the training in that example takes for you. The training (or, in our case, optimization) occurs when you create the nengo.Simulator; does that take a long time for you? The time spent in sim.run is not doing any training, just inference, so pickling can’t save any of that cost. I suspect that pickling/unpickling the simulator will not save much time compared to building the model normally.

Also, you might be interested to know that the example you’re looking at is quite old at this point, and the way that we currently recommend doing deep learning with Nengo is through the NengoDL package. You can see the spiking MNIST NengoDL example which does something similar. NengoDL also has a nicer interface for saving and loading models, the save/load_params methods.

Hope that helps!

Thank you for the detailed reply! The notes are very helpful. This classifier takes about 6 seconds (for each autoencoder version) for both training and testing (2500 images of 64x128 pixels) on an HP ZBook laptop, which is really fast. The interest in this classifier is to use it to detect objects in a fixed-size window search over video data during playback. I am trying to stick to Nengo core, because we want to implement this classifier on Loihi at some point in the future. I am not sure whether code written in NengoDL can run on Loihi; some comments on this would be helpful!

Yes, you can train a model with NengoDL and run it with NengoLoihi. This Loihi example uses weights that have been optimized with NengoDL (though that training process is not shown there; we’re working right now on an example that shows the whole process of training in NengoDL and inference in NengoLoihi).

Thanks a lot! This is great to know, that a model trained in NengoDL can be run in NengoLoihi! The example you linked gave me some ideas.
I will be looking forward to the next one to learn about both training and inference.
Best Regards,
Krishna

Instead of saving the model (the main question above), I was able to create a model (without any reference to the training data) using predefined weights, build a simulator for it, and call the sim in a loop for inference:

with nengo.Network(seed=3) as model:
    a = nengo.Ensemble(n_hid, n_vis, **ens_params)

    # Use the previously solved encoder weights
    weight_matrix1 = np.load('weights1.npy')
    a.encoders = weight_matrix1

    # Generate Gabor filters only to visualize them
    encoders = Gabor().generate(n_hid, (128, 64), rng=rng).reshape(n_hid, -1)
    tile(encoders.reshape(-1, 128, 64), rows=4, cols=6, grid=True)

    # Use the previously solved decoder weights as a fixed transform
    wm2 = np.load('weights2.npy')
    weight_matrix2 = wm2.transpose()
    c = nengo.Ensemble(2, 1)
    conn2 = nengo.Connection(a.neurons, c.neurons, transform=weight_matrix2)

# Build the simulator once, outside the network context
sim = nengo.Simulator(model)

Here is the example that tbekolay mentioned about converting a network trained in nengo_dl to run on nengo_loihi:

https://www.nengo.ai/nengo-loihi/examples/keras-to-loihi.html

I’m going to mark tbekolay’s first answer as the solution as the topic of running a nengo-dl model on nengo-loihi is out of the scope of the initial question. If you have any issues please feel free to make a new forum post.

Cheers