Unfortunately there isn’t a good way to work around this TensorFlow limitation in TensorFlow 2. The method you link to (using placeholders to feed in values) only works in TensorFlow < 2.0, as TensorFlow 2 did away with placeholders entirely.
However, I believe this issue should be resolved when we switch NengoDL from graph mode to TensorFlow’s eager mode, which doesn’t have the 2GB graph-serialization limit. In the past, eager mode was significantly slower than graph mode, but recent speed improvements mean we are planning to make that switch in the near future.
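For context, here is a minimal sketch of why eager mode sidesteps the limit (assuming TensorFlow >= 2.0, where eager execution is enabled by default; the array here is small just for illustration):

```python
import numpy as np
import tensorflow as tf

# Sketch: in eager mode, operations run immediately on tensors held in
# memory, so the data is never serialized into a GraphDef protobuf and
# the 2GB graph-size limit does not apply.
data = np.ones((1000, 1000), dtype=np.float32)  # in practice this could exceed 2GB
x = tf.constant(data)   # an eager tensor, not a constant baked into a graph
y = tf.reduce_sum(x)    # executes immediately, no session or feed_dict needed
print(float(y))  # 1000000.0
```

In graph mode, by contrast, `tf.constant` embeds the array’s bytes into the serialized graph itself, which is where the 2GB protobuf limit bites.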