KerasLMU 0.4.0 released

The KerasLMU team is happy to announce the release of KerasLMU 0.4.0.

What is KerasLMU?

KerasLMU is a Keras-based implementation of Legendre Memory Units (LMUs), a novel memory cell for recurrent neural networks that dynamically maintains information across long windows of time using relatively few resources. It has been shown to outperform standard LSTMs and other RNN-based models on a variety of tasks, generally with fewer internal parameters (see the paper for more details).

How do I use it?

KerasLMU is built using the standard Keras RNN API. If you have a model containing an RNN layer, such as tf.keras.layers.LSTM(...) or tf.keras.layers.RNN(tf.keras.layers.LSTMCell(...)), that layer can be swapped out for keras_lmu.LMU(...) or tf.keras.layers.RNN(keras_lmu.LMUCell(...)), as in the sketch below.
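
For example, here is a minimal sketch of a model with its LSTM layer swapped for an LMU layer. The input shape, layer sizes, and the memory_d/order/theta values are illustrative placeholders, not recommended settings:

import tensorflow as tf
import keras_lmu

# hypothetical input shape, for illustration only
n_steps, n_features = 100, 32

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_steps, n_features)),
    # previously: tf.keras.layers.LSTM(64)
    keras_lmu.LMU(
        memory_d=1,  # dimensionality of the memory component
        order=256,  # number of Legendre basis functions in the memory
        theta=n_steps,  # length of the sliding memory window, in timesteps
        hidden_cell=tf.keras.layers.SimpleRNNCell(64),  # cell for the hidden component
    ),
    tf.keras.layers.Dense(10),
])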

More information on the available parameters and configuration options can be found in the documentation.

What’s new?

The main new feature in this release is the option to learn the theta parameter of the LMU during training, enabled with trainable_theta=True. This is accompanied by a new discretizer option, which selects the discretization method used when computing the A/B matrices. The discretizer option is mainly useful in combination with trainable_theta=True, where setting discretizer="euler" may improve training speed, possibly at the cost of some accuracy.
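
As a rough sketch, reusing the illustrative settings from the example above:

# theta is learned during training, starting from the given initial value;
# the Euler discretization may speed up training, possibly at some cost
# to accuracy
lmu_layer = keras_lmu.LMU(
    memory_d=1,
    order=256,
    theta=100,
    hidden_cell=tf.keras.layers.SimpleRNNCell(64),
    trainable_theta=True,
    discretizer="euler",
)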

This release also includes a number of improvements to the feedforward LMU implementation (which has been renamed from LMUFFT to LMUFeedforward). It now supports memory_d > 1, so the feedforward implementation can be used in a wider variety of LMU models. There is also a new conv_mode option, which switches from the default FFT-based implementation to a raw convolution; depending on the model, this may be faster. Note that if you are using keras_lmu.LMU(...), KerasLMU will automatically use LMUFeedforward under the hood when the necessary conditions are met: hidden_to_memory=False, memory_to_memory=False, and a fixed sequence length (i.e., the shape of the time axis in the inputs is not None).
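
Here is a sketch of using LMUFeedforward directly; the settings are again illustrative, with conv_mode="raw" selecting the raw-convolution implementation described above:

# use the feedforward implementation directly
ff_layer = keras_lmu.LMUFeedforward(
    memory_d=4,  # memory_d > 1 is now supported
    order=256,
    theta=100,
    hidden_cell=tf.keras.layers.SimpleRNNCell(64),
    conv_mode="raw",  # raw convolution instead of the default FFT
)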

Check out the GitHub release page for a full changelog.

How do I get it?

To install KerasLMU, we recommend using pip:

pip install keras-lmu

More detailed installation instructions can be found here.

Where can I learn more?

See the KerasLMU documentation and the GitHub repository, as well as the LMU paper mentioned above.

Where can I get help?

You’re already there! If you have an issue upgrading or have any other questions, please post them in this forum.

Could somebody who understands how the LMU works please make one or more tutorial videos (as many as needed) about the LMU and its underlying mathematical concepts?

Thank you