Full-FORCE tutorial issue

Hello! I am new to Nengo and would appreciate some help with the tutorial here for implementing the full-FORCE algorithm on a population of Izhikevich neurons (the tutorial uses LIF neurons, but I have managed to change the neuron type to Izhikevich).
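For reference, this is roughly how I set the neuron type (a minimal sketch; the number of neurons is just a placeholder):

    import nengo

    with nengo.Network() as model:
        # same ensemble setup as in the tutorial, but with Izhikevich dynamics
        ens = nengo.Ensemble(n_neurons=100, dimensions=1,
                             neuron_type=nengo.Izhikevich())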

However, I am having trouble setting the output I want to learn. I have a (1, N) array, let’s call it “data”, containing the values of the signal I would like to model.

Could someone point me to the exact places in the tutorial code (in the link above) that I need to change to do this? I have tried, but with no success.

Thank you!

Welcome! Thanks for taking a look and trying to extend my tutorial. The input data is provided via a nengo.Node, specifically this line:

    u = nengo.Node(output=nengo.processes.PresentInput(U, dt))

You can replace U with a 1D vector representing the input time-series, and dt with the amount of time that you would like each element of the vector to be presented (where dt corresponds to the length of one time-step).
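For example, if your input were a hypothetical 1D array input_series, with one sample presented per time-step, that line could look like this (a sketch; input_series and the dt value are placeholders for your own data):

    import numpy as np
    import nengo

    dt = 0.001                                               # one sample per time-step
    input_series = np.sin(np.linspace(0, 2 * np.pi, 1000))   # placeholder 1D input

    with nengo.Network() as model:
        u = nengo.Node(output=nengo.processes.PresentInput(input_series, dt))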

Likewise, the target data is provided here:

    z = nengo.Node(size_in=1)
    nengo.Connection(u, z, synapse=nengolib.synapses.Bandpass(freq, decay))

You can replace z with whatever target data you want the network to learn from the given input signal. Again, z is a nengo.Node, so if your target data is another array you can do:

    z = nengo.Node(output=nengo.processes.PresentInput(target, dt))

and then remove the Bandpass connection (since your target would no longer be a decaying oscillation, as in the example).
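Putting that together, a minimal sketch of the two nodes might look like this (assuming data is your (1, N) array and each sample should be presented for one time-step of length dt; the pulse parameters here are placeholders):

    import numpy as np
    import nengo

    dt = 0.001
    pulse_interval, amplitude = 1.0, 1.0                           # placeholder pulse parameters
    data = np.sin(np.linspace(0, 2 * np.pi, 1000)).reshape(1, -1)  # placeholder target array

    with nengo.Network() as model:
        # input pulse, as in the tutorial
        U = np.zeros(int(pulse_interval / dt))
        U[0] = amplitude / dt
        u = nengo.Node(output=nengo.processes.PresentInput(U, dt))

        # target: present each sample of your own signal for dt seconds
        z = nengo.Node(output=nengo.processes.PresentInput(data.ravel(), dt))
        # no Bandpass connection from u to z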

If you try something out and it doesn’t work please let us know the specifics of the code and the details of the problem. Thanks!


Hello, and thank you for the reply!

I have tried what you suggested, but it throws a validation error. I have managed to get it to build in a different way, shown below, but then I encounter another error when I try to run the simulator:

    # The data
    raw_data = {}

    # Make it into dictionary
    for i in range(data.shape[1]):
        raw_data[dt * i] = data[0, i]

    with Network(seed=seed) as model:
        # Input is a pulse every pulse_interval seconds
        U = np.zeros(int(pulse_interval / dt))
        U[0] = amplitude / dt
        u = nengo.Node(output=nengo.processes.PresentInput(U, dt))

        # Desired output
        z = nengo.Node(size_in=1, output=Piecewise(raw_data))
        nengo.Connection(u, z)

    # Initial weights
    e_in = g_in * rng.uniform(-1, +1, (n, 1))    # fixed encoders for f_in (u_in)
    e_out = g_out * rng.uniform(-1, +1, (n, 1))  # fixed encoders for f_out (u)
    JD = rng.randn(n, n) * g / np.sqrt(n)        # target-generating weights (variance g^2/n)

Now it fails when I try to run the simulator:

    ~/.local/lib/python3.6/site-packages/nengo/processes.py in make_step(self, shape_in, shape_out, dt, rng)
        425     def make_step(self, shape_in, shape_out, dt, rng):
        426         tp, yp = zip(*sorted(iteritems(self.data)))
    --> 427         assert shape_in == (0,)
        428         assert shape_out == (self.size_out,)
        429

    AssertionError:

I think you can remove

    nengo.Connection(u, z)

since your z no longer requires the input of u. It did in the tutorial because z was a bandpass-filtered version of u. You will likely also need to drop size_in=1 from the z Node: the AssertionError comes from the Piecewise process checking that its Node receives no input (shape_in == (0,)), which fails when size_in is 1.
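In other words, a minimal sketch of the corrected target node could look like this (with placeholder values standing in for your raw_data dictionary):

    import nengo
    from nengo.processes import Piecewise

    # placeholder for the {time: value} dictionary built from your array
    raw_data = {0.0: 0.0, 0.001: 0.5, 0.002: 1.0}

    with nengo.Network() as model:
        # Piecewise supplies the target on its own, so no size_in and
        # no incoming connection from u are needed
        z = nengo.Node(output=Piecewise(raw_data))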