Hello,
I was trying to port this example, https://www.nengo.ai/nengo-dl/examples/lmu.html, to nengo_loihi, using this example as a reference: https://www.nengo.ai/nengo-loihi/v0.9.0/examples/mnist_convnet.html. However, I keep getting this error:
BuildError: No neurons marked for execution on-chip. Please mark some ensembles as on-chip.
What is the appropriate way to write the LMU psMNIST example with nengo_loihi?
Here is the full stack trace:
BuildError                                Traceback (most recent call last)
in
      1 n_presentations = 50
----> 2 with nengo_loihi.Simulator(net, dt=dt, precompute=False) as sim:
      3     # if running on Loihi, increase the max input spikes per step
      4     if 'loihi' in sim.sims:
      5         sim.sims['loihi'].snip_max_spikes_per_step = 120

~/nengo-loihi/nengo_loihi/simulator.py in __init__(self, network, dt, seed, model, precompute, target, progress_bar, remove_passthrough, hardware_options)
    137             precompute=precompute,
    138             remove_passthrough=remove_passthrough,
--> 139             discretize=target != "simreal",
    140         )
    141

~/nengo-loihi/nengo_loihi/builder/builder.py in build(self, obj, *args, **kwargs)
    221             add_params(obj)
    222
--> 223         built = model.builder.build(model, obj, *args, **kwargs)
    224         if self.build_callback is not None:
    225             self.build_callback(obj)

~/nxsdk09/lib/python3.5/site-packages/nengo/builder/builder.py in build(cls, model, obj, *args, **kwargs)
    237         for obj_cls in type(obj).__mro__:
    238             if obj_cls in cls.builders:
--> 239                 return cls.builders[obj_cls](model, obj, *args, **kwargs)
    240
    241         raise BuildError("Cannot build object of type %r" % type(obj).__name__)

~/nengo-loihi/nengo_loihi/builder/network.py in build_network(model, network, precompute, remove_passthrough, discretize)
     46         discretize_model(model)
     47
---> 48     validate_model(model)

~/nengo-loihi/nengo_loihi/builder/validate.py in validate_model(model)
     15     if len(model.blocks) == 0:
     16         raise BuildError(
---> 17             "No neurons marked for execution on-chip. "
     18             "Please mark some ensembles as on-chip."
     19         )
BuildError: No neurons marked for execution on-chip. Please mark some ensembles as on-chip.
Thanks!