Nengo DL converter

Hello @msanch35! You are correct: if you build the model with activation="relu" and then try to swap it with the following statement:

swap_activations={tf.nn.relu: nengo.SpikingRectifiedLinear()}

the activation won’t be swapped, because Keras resolves the string "relu" to tf.keras.activations.relu rather than tf.nn.relu. You should use the following instead:

swap_activations={tf.keras.activations.relu: nengo.SpikingRectifiedLinear()}

when using the string "relu" in the non-spiking network. This is adapted from the Conversion of sequential model example.
