I sorta asked this in another thread, but that thread got sidetracked, so …
I have a model that inputs one word at a time, e.g., JOHN KICKED BALL PERIOD SUSAN ATE SUSHI PERIOD WHO KICKED BALL QUESTION.
What I want to happen is that when PERIOD comes in, the preceding phrase is tied to a pointer and added to memory, so after the above words memory would contain
POINTER1*(JOHN + KICKED + BALL) + POINTER2*(SUSAN + ATE + SUSHI).
When QUESTION comes in, I want to retrieve the pointer (by unbinding the query words from memory), and then use that pointer to retrieve the phrase (by unbinding the pointer from memory).
So ideally, POINTER1 would be generated and used while PERIOD is the input; currently I have to force the model to use a pre-generated pointer.
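To make the intent concrete, here is how I picture the store-and-retrieve math outside of neurons, as a plain-numpy sketch of the HRR algebra (circular convolution for binding). The unbinding queries are my best guess at what should happen, not working nengo_spa code, and all the symbols are just random unit vectors:

```python
import numpy as np

rng = np.random.default_rng(42)
d = 512  # dimensionality; more dimensions give cleaner retrieval

def vec():
    v = rng.standard_normal(d)
    return v / np.linalg.norm(v)

def bind(a, b):
    # circular convolution, the HRR binding operation
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=d)

def unbind(a, b):
    # bind with the approximate inverse of b (the involution of b)
    b_inv = np.concatenate(([b[0]], b[:0:-1]))
    return bind(a, b_inv)

JOHN, KICKED, BALL = vec(), vec(), vec()
SUSAN, ATE, SUSHI = vec(), vec(), vec()
POINTER1, POINTER2 = vec(), vec()

# what memory should contain after the two PERIODs
memory = (bind(POINTER1, JOHN + KICKED + BALL)
          + bind(POINTER2, SUSAN + ATE + SUSHI))

# "WHO KICKED BALL?": unbind the known query words to get a noisy pointer ...
noisy_ptr = unbind(memory, KICKED + BALL)
# ... clean it up against the stored pointers ...
ptr = max((POINTER1, POINTER2), key=lambda p: np.dot(noisy_ptr, p))
# ... and unbind the clean pointer to recover the phrase
phrase = unbind(memory, ptr)
# among the names, JOHN should be most similar to the recovered phrase
```

With a few hundred dimensions the cleanup step recovers POINTER1 reliably.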
So, any way to do what I want with nengo_spa?
There is no easy way to generate a pseudo-random Semantic Pointer in the network. You probably have to implement a pseudo-random number generator in neurons.
What you can do instead: start with one Semantic Pointer X and bind it to another pointer Y. For the next pointer, bind the result with Y again, and so on (giving X * Y^n). Note that with purely random vectors, X * Y^n will grow exponentially in length. Thus, you will need to normalize at each step, or (better) use a unitary Y, i.e. a Y that does not change the length of X. The
nengo_spa module provides an easy way to add a unitary vector to a vocabulary.
Also note that for sufficiently large n, X * Y^n (with unitary Y) will produce vectors similar to those produced for smaller n, i.e. you will not get an infinite number of dissimilar vectors. In fact, all of these vectors should lie on a hyper-circle.
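The length-preservation property is easy to check outside of nengo with plain numpy. This sketch assumes the usual HRR definition of unitary (all Fourier coefficients of Y have magnitude 1), which is what makes circular convolution with Y length-preserving:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64

def bind(a, b):
    # circular convolution = HRR binding
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=d)

def make_unitary(v):
    # force all Fourier magnitudes to 1, so binding preserves length
    f = np.fft.fft(v)
    return np.fft.ifft(f / np.abs(f)).real

x = rng.standard_normal(d)
x /= np.linalg.norm(x)          # unit-length X
y = make_unitary(rng.standard_normal(d))

# generate the sequence X, X*Y, X*Y^2, ...
pointers = [x]
for _ in range(20):
    pointers.append(bind(pointers[-1], y))

norms = [np.linalg.norm(p) for p in pointers]
# every X * Y^n keeps the length of X, so no per-step normalization is needed
```

With a random (non-unitary) Y instead, the norms in this sequence would grow or shrink exponentially.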