Error in nengo_dl.Layer

conv1 = nengo_dl.Layer(tf.keras.layers.Conv1D(filters=16, kernel_size=3))(inp)

when I use this code, it always tells me:

TensorNode.tensor_func: Attempting to automatically determine TensorNode output shape by calling Layer.compute_output_signature produced an error. If you would like to avoid this step, try manually setting `TensorNode(..., shape_out=x)`.

But if I use:
conv1 = nengo_dl.Layer(tf.keras.layers.Conv2D(filters=16, kernel_size=3))(inp)

it runs without any error.
What's wrong here? I tried several things based on the error message, but none of them worked.
Does that mean nengo_dl can't use keras.layers.Conv1D?
I don't think so; that wouldn't make sense.

Anyone encountered this?
Thanks!

Hi @Evian,

It’s a little hard to tell what is causing the error without the full context of your code. To answer your question though:

NengoDL supports the Conv1D layer for sure. However, the Conv2D and Conv1D layers expect differently shaped inputs. The Conv2D layer convolves the input with a 2D kernel, so the input needs to also be at have two dimensional axes. Similarly, the Conv1D layer convolves the input with a 1D kernel, so the input needs to only have one dimensional axis. My suspicion then, is that you have not correctly modified the shape of the inp layer to work with the respective convolution layer shapes.