Memory errors despite a very small dataset in nengo_dl

You specified that your output is 1-dimensional (`x = nengo_dl.tensor_layer(x, tf.layers.dense, units=1)`), so the training process expects the target data to be 1-dimensional as well.
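
For example, nengo_dl expects training data shaped as `(batch, timesteps, dimensions)`, so the last axis of the target array needs to match that output dimensionality. Here is a minimal sketch of shaping flat labels to fit a 1-dimensional output (the array names and sizes are placeholders, and the exact training call depends on your nengo_dl version):

```python
import numpy as np

# Placeholder sizes -- adjust to your own dataset.
n_examples = 5000  # number of training examples
n_steps = 1        # simulation timesteps per example

# A flat label vector...
labels = np.random.rand(n_examples)  # placeholder labels

# ...reshaped to (batch, timesteps, dimensions) to match the
# 1-dimensional output of the dense layer above:
targets = labels.reshape(n_examples, n_steps, 1)
print(targets.shape)  # (5000, 1, 1)
```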


How can I improve the results? If I run a dataset composed of 5000 training and 1000 test images of 227x227 in this nengo_dl model, what are the minimum resources required? I have a lightweight system with 6 GB of RAM and a 2.4 GHz processor.

A MemoryError means that your program has run out of memory. If you get an unexpected MemoryError and you think you should have plenty of RAM available, you may be running a 32-bit Python installation: 32-bit Python can only address ~4 GB of RAM, and on Windows a 32-bit process is limited to 2 GB. That ceiling shrinks even further if the operating system itself is 32-bit, because of operating system overhead. The easy solution, if you have a 64-bit operating system, is to switch to a 64-bit installation of Python.
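
As a quick diagnostic, you can check from within Python whether your interpreter is 32-bit or 64-bit, and do a back-of-the-envelope estimate of the dataset footprint from the numbers in the question above (the 3-channel and float32 assumptions below are mine; adjust them to your actual data):

```python
import struct
import sys

# 32-bit Python prints 32 here; 64-bit prints 64.
print("Python is %d-bit" % (struct.calcsize("P") * 8))
print(sys.version)

# Rough footprint of the dataset described above, assuming
# float32 pixels (4 bytes) and 3 colour channels:
n_train, n_test, h, w, channels = 5000, 1000, 227, 227, 3
bytes_needed = (n_train + n_test) * h * w * channels * 4
print("Dataset alone: ~%.1f GB" % (bytes_needed / 1e9))  # ~3.7 GB
```

With ~3.7 GB for the raw images alone, before counting the network's parameters and intermediate activations, a 32-bit interpreter (or a 6 GB machine) can easily run out of memory even though the dataset sounds small.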