Bidirectional LMU, Keras Bidirectional layer equivalent

Hello,

I just got started with Nengo, and I am messing around with the NengoDL LMU examples. I wanted to try using Keras with NengoDL to implement a bidirectional LMU.

I hoped that it would be as easy as using

https://keras.io/api/layers/recurrent_layers/bidirectional/

with

nengo_dl.TensorNode as shown here

https://www.nengo.ai/nengo-dl/examples/tensorflow-models.html#Inserting-Keras-layers

but it doesn't seem to be the case. What is the intended way to achieve something like this?

Maybe I could just add another LMUCell that works with reversed input, basically implementing the bidirectionality myself?

Thanks in advance

Hi ondrysak,

Bidirectional layers should work with TensorNodes. When you say “it doesn't seem to be the case”, could you be more specific about what isn't working the way you'd expect?

Here is a simple example demonstrating how it might work:

import nengo
import nengo_dl
import tensorflow as tf
import numpy as np

# dummy data: 32 examples, each a length-4 sequence with 1 feature
train_images = np.zeros((32, 4, 1))

with nengo.Network() as net:

    # input node presenting one flattened example
    inp = nengo.Node(np.zeros(np.prod(train_images.shape[1:])))

    # wrap the Keras Bidirectional layer in a TensorNode via nengo_dl.Layer;
    # shape_in unflattens the input back to (sequence length, features)
    h = nengo_dl.Layer(tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units=128)))(
        inp, shape_in=(train_images.shape[1], 1)
    )

    # linear readout and probe to record the output
    out = nengo_dl.Layer(tf.keras.layers.Dense(units=10))(h)
    p = nengo.Probe(out)
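
And a minimal sketch of how you might run that network (the minibatch size and the reshape of the dummy data are just my assumptions here):

with nengo_dl.Simulator(net, minibatch_size=32) as sim:
    # present each flattened example on a single timestep; the
    # Bidirectional layer unflattens it via shape_in above
    output = sim.predict({inp: train_images.reshape((32, 1, -1))})
    print(output[p].shape)  # (32, 1, 10)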

Note that the Keras Bidirectional wrapper only works with Keras layers, so you couldn't use it with e.g. the LMUCell implementation in the NengoDL example you mention (which is a Nengo network, not a Keras layer). If you wanted to implement a bidirectional RNN directly in Nengo, then you'd need to do something like what you suggest: build two LMUCell networks and manually reverse the input to one of them.


I see where the problem was: the Keras Bidirectional wrapper is not supposed to work with Nengo layers. Thanks for your answer. Do you think developing a bidirectional wrapper in Nengo, similar to the one in Keras, would be doable/useful?

What is the intended way to reverse the input for a layer? There seems to be no parameter like

go_backwards=False

in Keras recurrent layers. Is using the function parameter of nengo.Connection the intended way to do something like this?

def reverse_arr(arr):
    return arr[::-1]

nengo.Connection(x, y, function=reverse_arr)

So for the LMU example, something like this?

def reverse_arr(arr):
    return arr[::-1]

with nengo.Network(seed=seed) as net:
    # remove some unnecessary features to speed up the training
    nengo_dl.configure_settings(
        trainable=None, stateful=False, keep_history=False,
    )
    
    # input node
    inp = nengo.Node(np.zeros(train_images.shape[-1]))

    # lmu cell
    lmu1 = LMUCell(
        units=106, 
        order=128, 
        theta=train_images.shape[1], 
        input_d=train_images.shape[-1]
    )
    lmu2 = LMUCell(
        units=106, 
        order=128, 
        theta=train_images.shape[1], 
        input_d=train_images.shape[-1]
    )
    conn1 = nengo.Connection(inp, lmu1.x, synapse=None)
    conn2 = nengo.Connection(inp, lmu2.x, synapse=None, function=reverse_arr)
    net.config[conn1].trainable = False
    net.config[conn2].trainable = False

    # dense linear readout
    out = nengo.Node(size_in=10)
    nengo.Connection(lmu1.h, out, transform=nengo_dl.dists.Glorot(), synapse=None)
    nengo.Connection(lmu2.h, out, transform=nengo_dl.dists.Glorot(), synapse=None, function=reverse_arr)

    # record output. note that we set keep_history=False above, so this will
    # only record the output on the last timestep (which is all we need
    # on this task)
    p = nengo.Probe(out)
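
The training setup would stay the same as in the LMU example, something along these lines (a sketch; I'm assuming the same loss/optimizer as that example):

with nengo_dl.Simulator(net, minibatch_size=100) as sim:
    # train as in the (unidirectional) NengoDL LMU example
    sim.compile(
        loss=tf.losses.SparseCategoricalCrossentropy(from_logits=True),
        optimizer=tf.optimizers.Adam(),
        metrics=["accuracy"],
    )
    sim.fit(train_images, train_labels, epochs=10)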

Hi ondrysak,

I have not worked with bidirectional layers before, but from what I understand, if you are trying to get two duplicate layers to run on reversed inputs of one another, and then sum their outputs, then what you have should do the trick.
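
One small note for anyone comparing against Keras: summing the two directions' outputs, as your network does at the out node, corresponds to passing merge_mode="sum" to the Keras wrapper (Keras defaults to concatenating them):

# Keras analogue of summing the two directions' outputs
# (the Keras default merge_mode is "concat")
bidir = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units=128), merge_mode="sum")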