Hello everyone,
I am new to Nengo and the LMU. How can I stack three LMU layers in TensorFlow?
Single LMU Layer:
def create_lmu_model():
    lmu_layer = keras_lmu.LMU(
        memory_d=1,
        order=12,
        theta=64,
        hidden_cell=tf.keras.layers.SimpleRNNCell(16),
        hidden_to_memory=False,
        memory_to_memory=False,
        input_to_hidden=True,
        kernel_initializer="ones",
    )

    # TensorFlow layer definition
    inputs = tf.keras.Input((n_steps, 1))
    lmus = lmu_layer(inputs)
    # lmus = lmu_layer(lmus)
    outputs = tf.keras.layers.Dense(step_future)(lmus)

    # TensorFlow model definition
    model = tf.keras.Model(inputs=inputs, outputs=outputs)
    return model
Stacked LMU layers?
I look forward to hearing from you.
Kind regards,
Kakpo
xchoo
February 2, 2022, 2:48am
Hi @adokoka , and welcome to the Nengo forums.
First, a point of clarification: if by “stacking” you mean creating a network with a structure like input → LMU1 → LMU2 → LMU3 → …, then you almost have the correct code already. The important thing to note is that each keras_lmu.LMU(...) call creates one LMU layer, so to stack multiple LMU layers you’ll need to create each layer separately first:
def create_lmu_model():
    lmu_layer1 = keras_lmu.LMU(...)
    lmu_layer2 = keras_lmu.LMU(...)
    lmu_layer3 = keras_lmu.LMU(...)
Then, you can use the standard TensorFlow model definition to connect the layers together:
    inputs = ...
    lmus = lmu_layer1(inputs)
    lmus = lmu_layer2(lmus)
    lmus = lmu_layer3(lmus)
    outputs = ...(lmus)
    ...
And that should do it for you!
adokoka
February 2, 2022, 12:36pm
Hi Xuan,
Thank you very much for your quick response!
I meant “stacking” indeed as in:
inputs → LMU1 → LMU2 → LMU3 → … outputs
Your answer works very well. I will take the model further from here, and I will give you or the community a shout if I need any more details.
Once again, many thanks for your help.
Kind regards,
Kakpo