Hello @ji16qewe, welcome to our community. For the following question, I am pointing you towards two Nengo Core tutorials which show you how to represent N-dimensional data and do calculations on it. Note that in these tutorials N=2, but the principle for representing 4D data remains the same.
Basically, you create N one-dimensional ensembles, e.g. `A = nengo.Ensemble(100, dimensions=1, radius=10)` and `B = nengo.Ensemble(100, dimensions=1, radius=10)` (for N=2), and then create another ensemble to represent the N=2 dimensional data, i.e. `combined = nengo.Ensemble(220, dimensions=2, radius=15)`.
You then connect A and B to the corresponding dimensions of the combined ensemble.
You will find more details in the linked tutorials.
Next, with respect to the following:
I am not completely sure if Nengo or Nengo-DL allows spike-based training as of now. There's KerasSpiking, which allows you to do spike-aware training: during the training phase it does a spike-based forward pass and a non-spiking backward pass (to propagate the errors back).
With Nengo-DL you can have an end-to-end spiking network, with room for some exceptions like the MaxPooling op and unsupported activations (e.g. softmax) which do not have a spiking counterpart. Training and inference in Nengo-DL works as follows: you first train your network with non-spiking neurons (e.g. ReLU), then convert your model to a spiking one using the `nengo_dl.Converter()` API, and then proceed with inference. An excellent tutorial on training and inference with Nengo-DL is here.
With respect to the following about training and inference with 4D data: the above-linked Nengo-DL tutorial shows how a network accepts 3D data. Basically, you create your usual TF network architecture to accept the N-dimensional data, and then, while working in the Nengo-DL context, you flatten your data before passing it to the Nengo-DL simulator functions.
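As a sketch of that flattening step (the shapes here are illustrative; the Nengo-DL simulator functions expect inputs shaped `(batch_size, n_steps, flattened_dimension)`):

```python
import numpy as np

# Made-up batch of 32 samples, each a 2x2 input (4D when flattened).
batch = np.random.rand(32, 2, 2)
n_steps = 30  # simulation timesteps per sample

flat = batch.reshape(32, 1, -1)         # -> (32, 1, 4)
tiled = np.tile(flat, (1, n_steps, 1))  # -> (32, 30, 4), repeated over time

# `tiled` can now be passed to e.g. sim.predict({input_node: tiled}).
```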
If your task is deep learning, I would suggest you look at both Nengo-DL and KerasSpiking and evaluate which suits you best.