If you specify `activation="relu"`, under the hood that is mapped to `tf.keras.activations.relu`. So you would need to set up your activation swap like `swap_activations={tf.keras.activations.relu: nengo.SpikingRectifiedLinear()}` (rather than `{tf.nn.relu: nengo.SpikingRectifiedLinear()}`).
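
Here is a minimal sketch of what that looks like end to end; the model architecture (input/layer sizes) is hypothetical, but the `swap_activations` usage follows the point above:

```python
import nengo
import nengo_dl
import tensorflow as tf

# Hypothetical Keras model; activation="relu" is resolved by Keras
# to tf.keras.activations.relu under the hood
inp = tf.keras.Input(shape=(10,))
out = tf.keras.layers.Dense(5, activation="relu")(inp)
model = tf.keras.Model(inputs=inp, outputs=out)

# Key the swap on tf.keras.activations.relu (not tf.nn.relu),
# since that is what activation="relu" actually maps to
converter = nengo_dl.Converter(
    model,
    swap_activations={tf.keras.activations.relu: nengo.SpikingRectifiedLinear()},
)
```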