For a task I needed to compute the derivative of the output F(x) with respect to the weights w (∂F(x, w)/∂w). Is there any way to get this derivative matrix in Nengo?
The core of Nengo doesn’t have any tools for calculating Jacobian matrices, but since NengoDL is built on top of TensorFlow, it can do automatic differentiation to find Jacobians. There is some discussion about how to do that in this thread. It would end up looking something like
gradients = tf.gradients(sim.tensor_graph.probe_arrays[p],
                         sim.tensor_graph.input_ph[x])
dydx = sim.sess.run(gradients,
                    feed_dict=sim._fill_feed(n_steps, training=True))
but the specifics would change depending on the network you have set up. Note that tf.gradients sums over the output dimensions, so the result is a vector-Jacobian product rather than the full Jacobian; TensorFlow also has a dedicated Jacobian function, which might prove useful. The first step will be to set up a network computing F(x) that works with NengoDL, so if you go that route start there and we can help more once we have the full network details.
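To make the distinction concrete, here is a minimal sketch of the TensorFlow side in isolation, using eager mode and `tf.GradientTape.jacobian`. The function F(x) = Wx and the values of `W` and `x` are toy placeholders for illustration, not part of any Nengo model:

```python
import tensorflow as tf

# Toy linear function F(x) = W x; its Jacobian dF/dx is exactly W
W = tf.constant([[1.0, 2.0], [3.0, 4.0]])
x = tf.Variable([1.0, 1.0])

with tf.GradientTape() as tape:
    y = tf.linalg.matvec(W, x)  # F(x) = W x

# jacobian() returns the full (output_dim x input_dim) matrix,
# whereas tape.gradient()/tf.gradients would sum over the outputs
jac = tape.jacobian(y, x)
print(jac.numpy())  # [[1. 2.], [3. 4.]]
```

With a NengoDL model, the same idea applies, but `y` and `x` would be the probe and input tensors pulled out of the simulator's graph.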
How can I calculate that inside a custom learning loop in nengo_dl?
If you’ve got a list of weight tensors that you want to update, you can apply a list of gradient tensors to that list of weight tensors with:
optimizer.apply_gradients(zip(gradients, weights), experimental_aggregate_gradients=False)
optimizer here is an instance of a tf.keras.optimizers.Optimizer (e.g. tf.keras.optimizers.SGD).
You can find the tensors for the weights in sim.tensor_graph.trainable_weights, though they're not labelled there, so you might need another way to get a handle on the specific weights you want to update.
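Putting the pieces together, here is a hedged sketch of the apply_gradients pattern in a custom loop. It uses a plain TensorFlow variable and a toy loss in place of the tensors you would pull out of sim.tensor_graph.trainable_weights:

```python
import tensorflow as tf

# Stand-in for one entry of sim.tensor_graph.trainable_weights
w = tf.Variable([2.0, -1.0])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for _ in range(100):
    with tf.GradientTape() as tape:
        loss = tf.reduce_sum(tf.square(w))  # toy loss, minimized at w = 0
    # compute gradients for the list of weights, then apply them
    gradients = tape.gradient(loss, [w])
    optimizer.apply_gradients(zip(gradients, [w]))

print(w.numpy())  # converges toward [0, 0]
```

In a real NengoDL loop the gradients could equally come from your own computation (e.g. the Jacobian-based update discussed above) rather than from a tape over a loss.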
Also note that NengoDL merges some operators by default, which you may want to turn off to make it easier to get the tensors you want (so they don't get squashed together). You can turn off all simplifications via nengo_dl.configure_settings.
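As a sketch (assuming nengo and nengo_dl are installed), the config would look something like this, using the no-op planner to disable operator merging and an empty simplification list:

```python
import nengo
import nengo_dl
from nengo_dl.graph_optimizer import noop_planner

with nengo.Network() as net:
    # disable operator merging and graph simplifications so that
    # individual weight tensors remain separately accessible
    nengo_dl.configure_settings(planner=noop_planner, simplifications=[])
```

This will slow down the simulation, so you may want to re-enable the defaults once you have identified the tensors you need.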