KerasLMU 0.3.0 released

The KerasLMU team is happy to announce the release of KerasLMU 0.3.0.

What is KerasLMU?

KerasLMU is a Keras-based implementation of Legendre Memory Units, a novel memory cell for recurrent neural networks that dynamically maintains information across long windows of time using relatively few resources. It has been shown to perform better than standard LSTMs and other RNN-based models on a variety of tasks, generally with fewer internal parameters (see the paper for more details).

How do I use it?

KerasLMU is built using the standard Keras RNN API. If you have a model containing an RNN layer, such as tf.keras.layers.LSTM(...) or tf.keras.layers.RNN(tf.keras.layers.LSTMCell(...)), you can swap it with keras_lmu.LMU(...) or tf.keras.layers.RNN(keras_lmu.LMUCell(...)).

More information on the available parameters and configuration options can be found in the documentation.

What’s new?

Previous users will note that the name of the package has changed from “NengoLMU” to “KerasLMU” (as this better reflects the Keras-based implementation of this package, and we have plans to make a separate Nengo-based implementation in the future). Along with this name change the package/module name has changed, so instead of doing pip install lmu and import lmu you now do pip install keras-lmu and import keras_lmu.

In addition, we have significantly reworked the LMU API, and introduced a number of new features. Some of the most significant changes are:

  • Removed a number of elements that we have not found useful in our experimentation with LMUs, such as the individual trainable/initializer arguments for each connection within the LMU and the LMUCellODE class (although we have plans to implement an improved method for optimizing the memory component in the future).
  • Added the ability to selectively enable/disable the connections between the hidden and memory components within the LMU. Note that these connections default to disabled, so the new default LMU is a trimmed-down version of the old default.
  • Added support for multi-dimensional memories.
  • Added support for arbitrary hidden components (anything that implements the Keras RNNCell API can be used, e.g. tf.keras.layers.LSTMCell).

Check out the GitHub release page for a full changelog.

How do I get it?

To install KerasLMU, we recommend using pip:

pip install keras-lmu

More detailed installation instructions can be found here.

Where can I learn more?

Check out the documentation for usage details and configuration options, the paper for the background on Legendre Memory Units, and the GitHub release page for the full changelog.

Where can I get help?

You’re already there! If you have an issue upgrading or have any other questions, please post them in this forum.