Indexing of ensembles for processing macroblocks

Dear all, a question came up about the indexing of neurons within an ensemble while I was trying to define an unsupervised learning framework using Nengo.

Suppose we define a small ensemble of 4x4 neurons and have it process one macroblock of 4x4 pixels. Do Nengo's principles apply a sequential indexing by default? That is, does the first neuron process the (1,1) pixel, and so forth? The unsupervised learning example uses a sine wave as input, which I assume gives every neuron in the ensemble the same, duplicated input source. What happens if the input is a vector or array?
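For concreteness, my current understanding of that example is sketched below (the names are only illustrative): if I understand it correctly, with a scalar input every neuron receives the same signal, weighted only by its own randomly chosen encoder, rather than each neuron getting its own pixel.

import numpy as np
import nengo

with nengo.Network() as model:
    # The unsupervised learning example drives the ensemble with a scalar
    # sine wave; all 16 neurons receive that same signal, scaled by their
    # own encoders, not one pixel per neuron.
    sine = nengo.Node(lambda t: np.sin(t))
    ens = nengo.Ensemble(n_neurons=16, dimensions=1)
    nengo.Connection(sine, ens)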

Also, if the learning method is the Oja rule, then the decay term is the same for every output neuron; but what if each output neuron has a different decay term, as in GHA (the generalized Hebbian algorithm), where each output neuron's decay term (a product of indexed inputs and indexed outputs) depends on the index of that output neuron? Do we then have to define our own built-in learning function?
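To make the difference concrete, these are the standard forms of the two rules as I understand them, where $\eta$ is the learning rate, $x_j$ an input component, $y_i$ the $i$-th output, and $w_{ij}$ a weight; in GHA the decay term sums over outputs with index $k \le i$, so it depends on how the output neurons are indexed:

\Delta w_{ij} = \eta\, y_i \left( x_j - y_i\, w_{ij} \right) \qquad \text{(Oja)}

\Delta w_{ij} = \eta\, y_i \left( x_j - \sum_{k \le i} w_{kj}\, y_k \right) \qquad \text{(GHA, i.e. Sanger's rule)}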

I have checked this question against the cuda-convnet2 example, but I do not fully understand the code. Please point me to any existing example if that would save time.

Many thanks.

Hello,

It is difficult for me to answer your questions without knowing what you already know, so before I attempt an answer, let me ask a bit about your background first.

How much of Nengo do you understand so far? Have you gone through all of the tutorials in the Nengo GUI? What type of background do you have?

Hi Seanny123, thanks for reading my question.
I was attempting to create a neural framework with Nengo 1.4 and have gone through the Nengo 1.4 tutorials, but I found that Nengo 2.1 has better support for unsupervised learning. I am now working through the Nengo 2.1 examples to understand the differences between Nengo 2.1 and Nengo 1.4. As far as I understand Nengo 2.1, I can set encoder/decoder parameters or transforms across ensembles to achieve a kind of neuron indexing. But I am still not very clear about how neurons indexed within a single ensemble would process macroblocks as the input data set. This is my master's research project on motion estimation via artificial neural networks.

I do believe that Nengo 2.1 can support CNNs, as it already appears to; unsupervised learning should not be a big problem unless it requires some new built-in interface. My background is embedded systems, and I learned neuroscience through other simulation tools (MATLAB, Emergent) half a year ago.

I have some sense of how the parameters could be manipulated to address this question:
The first way is to split a macroblock into 16 inputs and feed them into a single ensemble representing 16 dimensions. (This seems closest to the original intent of my question.)
Another way is to set the encoders so that the 16 elements of the input array are assigned to different neurons in an ensemble of 16 neurons, say with the direct neuron type.
The last promising way is to connect the input macroblock to the ensemble's neurons directly, as the cuda-convnet example (lines 82-83) does:
e = nengo.Ensemble(n, 1, label='%s_neurons' % name)
nengo.Connection(input0, e.neurons)
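For my own reference, here is a minimal sketch of how I imagine the first and last options would look (the sizes and names are only illustrative):

import numpy as np
import nengo

macroblock = np.random.rand(16)  # a 4x4 macroblock flattened to 16 pixels

with nengo.Network() as model:
    stim = nengo.Node(macroblock)

    # Option 1: one ensemble representing the whole 16-D vector; every
    # neuron sees the full macroblock through its encoder.
    ens_vec = nengo.Ensemble(n_neurons=100, dimensions=16)
    nengo.Connection(stim, ens_vec)

    # Option 3 (as in the cuda-convnet example): bypass the encoders and
    # drive the 16 neurons directly, with an identity transform so that
    # neuron i receives pixel i.
    ens_pix = nengo.Ensemble(n_neurons=16, dimensions=1)
    nengo.Connection(stim, ens_pix.neurons, transform=np.eye(16))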

So, any advice on the right way forward, so that I avoid barking up the wrong tree?

One important issue here might be what we consider unsupervised learning… the unsupervised learning rules in Nengo (BCM and Oja) are not the same type of unsupervised learning rules that convolutional neural networks use. I think I would consider methods like autoencoders to be “self-supervised” rather than unsupervised. Getting the type of performance that CNNs get takes a fair bit more work; see @Eric’s work on that at http://arxiv.org/abs/1510.08829.
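For context, the Nengo 2.1 pattern for applying these rules looks roughly like the unsupervised learning example; here is a minimal sketch (the ensemble sizes and learning rate are only illustrative):

import numpy as np
import nengo

with nengo.Network() as model:
    stim = nengo.Node(lambda t: np.sin(t))
    pre = nengo.Ensemble(50, dimensions=1)
    post = nengo.Ensemble(50, dimensions=1)
    nengo.Connection(stim, pre)

    # Solve for a full weight matrix and let Oja (or BCM) modify it online.
    conn = nengo.Connection(
        pre, post,
        solver=nengo.solvers.LstsqL2(weights=True),
        learning_rule_type=nengo.Oja(learning_rate=1e-6),
    )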

If you’re mostly interested in CNN-type learning rules and higher-level vision problems, then you might want to look into other Python libraries like Lasagne. In addition to the spiking deep learning work by Eric, we also have a related line of research on integrating traditional deep learning networks like CNNs with Nengo; see the nengo_deeplearning project.

What kind of model are you looking to produce in the end? If you can provide some more specifics on the types of algorithms you’re planning to use for motion estimation then we can hopefully point you to the right resources.

Hi @tbekolay
The neural model I am trying to create is actually a hybrid neural network, similar to a CNN. The network abstracts low-level features in its front, unsupervised layers and passes these features on for further extraction. In the picture, the green bounding boxes mark the parts that use a (generalized) Hebbian learning algorithm as the training method. Each small 2x2 green neuron ensemble stores the pixels of a 2x2 macroblock, and the red neurons finally output converged features after many iterations of training. The cyan-colored parts receive these spatially based features for supervised training.

So this network is partially similar to a CNN, since it has ensembles acting like a filter layer. The difference, however, is that the front parts are feedforward and produce their output naturally, while the cyan parts at the end depend on an expectation (target) to adjust their parameters.
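To make the intended structure concrete, here is a rough sketch of how such a hybrid might be wired up in Nengo 2.1, with an unsupervised (Oja) connection at the front and a supervised (PES) connection at the back; everything here is illustrative rather than a working model:

import numpy as np
import nengo

with nengo.Network() as model:
    pixels = nengo.Node(np.random.rand(4))      # one 2x2 macroblock
    front = nengo.Ensemble(16, dimensions=4)    # "green" input layer
    feature = nengo.Ensemble(16, dimensions=4)  # "red" feature layer
    out = nengo.Ensemble(16, dimensions=4)      # "cyan" output layer

    nengo.Connection(pixels, front)

    # Unsupervised front: Hebbian (Oja) learning on a full weight matrix.
    nengo.Connection(front, feature,
                     solver=nengo.solvers.LstsqL2(weights=True),
                     learning_rule_type=nengo.Oja(learning_rate=1e-6))

    # Supervised back: PES learning driven by an error signal computed
    # as (actual output - expected output).
    conn = nengo.Connection(feature, out,
                            learning_rule_type=nengo.PES(learning_rate=1e-4))
    target = nengo.Node(np.zeros(4))            # placeholder expectation
    error = nengo.Ensemble(40, dimensions=4)
    nengo.Connection(out, error)
    nengo.Connection(target, error, transform=-1)
    nengo.Connection(error, conn.learning_rule)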

This network model needs a library that allows arbitrary or selective connectivity, like clusters. Also, customizing certain layers with flexibility in the learning objective and method is going to be a problem with some neural network tools such as MATLAB. I am reading about Lasagne and feel its style is really close to MATLAB, where neurons can be treated as matrices. However, the lack of a powerful GUI is a headache for me. Lasagne does meet my needs for customization and for treating weighted connections as matrices, but it does not have a powerful GUI that gives immediate visual feedback the way Nengo does.

I feel that the work I am doing requires either that Lasagne have a GUI like Nengo's, or that I build the hybrid network in Nengo by some crude means (like accessing the weights and computing with them as matrices directly, regardless of the learning rules Nengo already has). This is a dilemma for me…
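For reference, the "crude" route of inspecting the weights directly is at least possible in Nengo 2.1; here is a minimal sketch of probing a learned connection's weight matrix (the names and sizes are only illustrative):

import nengo

with nengo.Network() as model:
    pre = nengo.Ensemble(16, dimensions=1)
    post = nengo.Ensemble(16, dimensions=1)
    conn = nengo.Connection(pre, post,
                            solver=nengo.solvers.LstsqL2(weights=True),
                            learning_rule_type=nengo.Oja(learning_rate=1e-6))
    # Record the full weight matrix as it changes during learning.
    w_probe = nengo.Probe(conn, "weights", synapse=None, sample_every=0.01)

sim = nengo.Simulator(model)
sim.run(1.0)
weights = sim.data[w_probe]  # shape: (timesteps, post_neurons, pre_neurons)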