Memory errors despite a very small dataset in nengo_dl


You specified that your output is 1-dimensional (x = nengo_dl.tensor_layer(x, tf.layers.dense, units=1)), so it expects the target data to be 1-dimensional as well.
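As a minimal sketch of what that shape constraint means in practice (the array names and sizes here are hypothetical, not from your model): with units=1, the per-example targets typically need a trailing axis of size 1, and nengo_dl training data is usually arranged as (n_examples, n_steps, output_dim).

```python
import numpy as np

# Hypothetical scalar labels, one per example
n_examples = 100
labels = np.arange(n_examples, dtype=np.float32)

# Reshape to (n_examples, n_steps, output_dim); with units=1 the
# last axis must be 1, otherwise the target dimensionality mismatches
# the 1-dimensional output layer.
n_steps = 1
targets = labels.reshape(n_examples, n_steps, 1)
print(targets.shape)  # (100, 1, 1)
```

If your labels are currently shaped (n_examples,) or (n_examples, n_steps), a reshape like the one above is usually all that is needed to match a 1-unit output.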


How can I improve the results? And if I run a dataset composed of 5000 training and 1000 test images of 227x227 through this nengo_dl model, what are the minimum resources required? I have a lightweight system with 6 GB of RAM and a 2.4 GHz processor.
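A rough back-of-envelope estimate may help here. Assuming float32 pixels and 3 color channels (adjust if your images are grayscale or a different dtype), the raw dataset alone is sizable relative to 6 GB of RAM:

```python
# Rough memory estimate for the raw dataset only -- the simulator's
# internal state, gradients, and activations will add substantially on top.
n_train, n_test = 5000, 1000
h = w = 227
channels = 3          # assumption: RGB images
bytes_per_value = 4   # assumption: float32

dataset_bytes = (n_train + n_test) * h * w * channels * bytes_per_value
print(f"{dataset_bytes / 2**30:.2f} GiB")  # ~3.46 GiB for pixel data alone
```

Since the raw data alone approaches your 6 GB of RAM before any model state is allocated, training the whole set at once is unlikely to fit; loading the data in chunks and training in small minibatches would reduce peak memory.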