Unreliable circular convolution of two 32-dimensional vectors


#1

Hi guys,

I had been getting some weird answers from my implementation of a model that does circular convolution with images and labels, so I went back to basics, i.e. the circular convolution tutorial [here](https://pythonhosted.org/nengo/examples/convolution.html), adapting it to use arbitrary random vectors instead of the spa Vocabulary module. The output I get from the convolution network is sometimes close to the ground-truth circular convolution, but often it is not, and this changes from simulation to simulation.
Here’s the code to reproduce what I’m seeing:

import numpy as np
import nengo
from matplotlib import pyplot as plt

inputsize = 32

A = np.random.rand(inputsize)
B = np.random.rand(inputsize)
# just the ground truth circular convolution of A and B
C = np.real(np.fft.ifft(np.fft.fft(A)*np.fft.fft(B)))

model = nengo.Network()
with model:
  nodeA = nengo.Node(output=A)
  nodeB = nengo.Node(output=B)

  cconv_ens = nengo.networks.CircularConvolution(200, dimensions=inputsize)

  nengo.Connection(nodeA, cconv_ens.A)
  nengo.Connection(nodeB, cconv_ens.B)

  # Probe the output
  out = nengo.Probe(cconv_ens.output, synapse=0.03)

with nengo.Simulator(model) as sim:
  sim.run(1.)

plt.plot(sim.trange(), nengo.spa.similarity(sim.data[out], [A, B, C], normalize=True))
plt.legend(['A', 'B', 'C'], loc=4)
plt.xlabel("t [s]")
plt.ylabel("dot product between circular convolution out and A, B, and C");
plt.show()

So just now I ran this, and the first time the output was:

*(plot omitted)*

Not what I'm looking for. I ran it again and the output was:

*(plot omitted)*

Now that's more like it. Let's try it a third time:

*(plot omitted)*

Hmmmm. This happens whether I'm on Nengo 2.1.2 or 2.2.0.

Can you reproduce this and help me troubleshoot? Seems like maybe there’s a really simple answer I might be missing.

Spencer


#2

Hi Spencer,

The problem is the length of your input vectors. Ensembles in Nengo are optimized for a certain input radius; outside of this radius the neurons saturate and give an inaccurate representation of the input. To get the example working, you can do either of the following:

  1. Set the input_magnitude argument of CircularConvolution higher. A value of 4 seems to produce decent results.
  2. Normalize your vectors to unit length (e.g. A /= np.linalg.norm(A)).
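For instance, option 2 applied to the input setup in your script would look like this (a minimal sketch; the rest of the model stays the same):

```python
import numpy as np

inputsize = 32

A = np.random.rand(inputsize)
B = np.random.rand(inputsize)

# Normalize both inputs to unit length so they stay inside the
# default representational radius of the ensembles.
A /= np.linalg.norm(A)
B /= np.linalg.norm(B)

# Ground-truth circular convolution of the normalized vectors
C = np.real(np.fft.ifft(np.fft.fft(A) * np.fft.fft(B)))
```

With unit-length inputs the neurons no longer saturate, so the probed output should track C much more reliably from run to run.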

Also note that np.random.rand gives you a uniform distribution over the range 0 <= x < 1, which means your vectors will not have any negative components. Maybe that is what you want, but it is more common to draw from a normal distribution centered at 0 to get both positive and negative components. For example, np.random.randn(inputsize) / np.sqrt(inputsize) gives you vectors that are close to unit length (note the n in randn).
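A quick check of that claim (a standalone snippet, not part of the tutorial):

```python
import numpy as np

rng = np.random.RandomState(0)  # fixed seed for reproducibility
inputsize = 32

# Each component has variance 1/inputsize, so the expected squared
# norm is 1, and the norm concentrates near 1 as inputsize grows.
norms = [np.linalg.norm(rng.randn(inputsize) / np.sqrt(inputsize))
         for _ in range(1000)]

print(np.mean(norms))  # close to 1
```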