The NengoDL team is happy to announce the release of NengoDL 3.3.0.
What is NengoDL?
NengoDL is a backend for Nengo that integrates deep learning methods (supported by the TensorFlow framework) with other Nengo modelling tools. NengoDL allows users to optimize their models using deep learning training methods, improves simulation speed (on CPU or GPU), can automatically convert Keras models to Nengo networks, and makes it easy to insert TensorFlow code (individual functions or whole network architectures) into Nengo networks.
How do I use it?
To use NengoDL, replace instances of nengo.Simulator with nengo_dl.Simulator. For example, if you have a network called net that you run as

with nengo.Simulator(net) as sim:
    sim.run(10)

you would change that to

with nengo_dl.Simulator(net) as sim:
    sim.run(10)

and that’s it!
Information on accessing the more advanced features of NengoDL can be found in the documentation.
The most significant feature of this release is that NengoDL now operates in TensorFlow’s “eager mode” by default. If you were just building models in Nengo and using the NengoDL Simulator, you probably won’t notice any difference as a result of this change. But if you are integrating NengoDL with other TensorFlow features (such as data pipelines, training callbacks, or using TensorNodes to embed TensorFlow/Keras code), you should find that it all works more smoothly now. For some models we still find that TensorFlow’s “graph mode” is faster than eager mode, so we continue to support both; see the changelog or the documentation for instructions on how to switch if you notice a slowdown. We also added support for some of the new neuron types introduced in Nengo core (including RegularSpiking). And as always, we fixed some bugs! Check out the GitHub release page for a full changelog.
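The changelog and documentation describe the supported ways to switch modes; as a minimal sketch, one TensorFlow-level way to fall back to graph mode is TensorFlow’s own global switch, called before any Simulator is built (assumption: whether this is the recommended NengoDL mechanism is covered in the documentation):

```python
import tensorflow as tf

# TensorFlow's global switch back to graph-mode execution.
# Call this before building any nengo_dl.Simulator.
# (Assumption: this is one way to leave eager mode; see the
# NengoDL documentation for the recommended approach.)
tf.compat.v1.disable_eager_execution()

# Confirm that eager execution is now off.
print(tf.executing_eagerly())
```

Note that this is a process-wide setting, so it affects all TensorFlow code in the script, not just NengoDL.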
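The new neuron types themselves come from Nengo core. For example, RegularSpiking wraps a rate-based neuron model so that it emits evenly spaced spikes at the underlying rate. A sketch of using it (the Tanh base type and the ensemble parameters are illustrative assumptions):

```python
import nengo

# RegularSpiking turns a rate neuron model (here Tanh, chosen as an
# illustrative example) into a spiking one that fires regularly
# spaced spikes at the rate the base model would output.
with nengo.Network() as net:
    ens = nengo.Ensemble(
        n_neurons=10,
        dimensions=1,
        neuron_type=nengo.RegularSpiking(nengo.Tanh()),
    )

print(type(ens.neuron_type).__name__)
```

Ensembles built this way can then be simulated with nengo_dl.Simulator like any other Nengo model.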
How do I get it?
To install NengoDL, we recommend using
pip install nengo-dl
More detailed installation instructions can be found in the NengoDL documentation.
Where can I learn more?
Check out the NengoDL documentation, which includes examples and detailed information on all of NengoDL’s features.
Where can I get help?
You’re already there! If you have an issue upgrading or have any other questions, please post them in this forum.