Training data size and event distribution for LMU time series

Hello all, I have done quite a bit of research and read the paper numerous times, but I haven't been able to reach a clear understanding that answers my questions below. Thank you in advance for any help and clarity you can offer!

I have a few questions, as follows:

[Question 1] I understand the LMU to function similarly to a standard RNN as it relates to time series data. Does a sliding window work with the LMU for event-based data?

[Question 2] Does unbalanced data have an effect on the LMU?

[2a] Is small-batch training compatible with the LMU?

[Question 3] Should the "size" of the LMU be determined by the total amount of training data, or rather by the size of each subset of data (i.e., each individual time series)?

Thank you very much in advance, and if you have any questions or need clarification, please let me know.

These sound like research questions to me! I don’t know of any existing applications of LMUs to event-based data of the type that you’re describing, so I think these are open questions.

To answer them, I would start by creating a simple LMU memory system (that is, just the dynamical system defined by the A and B matrices), and seeing how well it can encode/decode some of your event-based data. Here’s a notebook that looks at the ability of an LMU memory to encode some simple signals:
lmu-basics.ipynb (676.5 KB)
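For a sense of what that memory system looks like in code, here is a minimal sketch of the LMU's A and B matrices and a simple Euler integration of the resulting dynamical system. The matrix definitions follow the LMU paper (Voelker et al., 2019); the function names and the choice of Euler integration (rather than the zero-order-hold discretization used in practice) are mine, for illustration only.

```python
import numpy as np

def lmu_matrices(order, theta):
    """Construct the LMU's A and B matrices (Voelker et al., 2019).

    These define a linear system whose state holds a compressed
    (Legendre-basis) representation of the last `theta` seconds of input.
    """
    Q = np.arange(order)
    R = (2 * Q + 1)[:, None] / theta  # per-row scaling (2i + 1) / theta
    j, i = np.meshgrid(Q, Q)
    # a_ij = (2i+1)/theta * (-1 if i < j else (-1)^(i-j+1))
    A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * R
    # b_i = (2i+1)/theta * (-1)^i
    B = ((-1.0) ** Q[:, None]) * R
    return A, B

def encode(u, order=12, theta=1.0, dt=0.01):
    """Run the memory over a 1-D input signal with simple Euler steps.

    Returns the memory state m at every timestep; decoding the delayed
    window from m is done via Legendre polynomials (see the notebook).
    """
    A, B = lmu_matrices(order, theta)
    m = np.zeros(order)
    states = []
    for u_t in u:
        m = m + dt * (A @ m + B[:, 0] * u_t)
        states.append(m.copy())
    return np.array(states)
```

Feeding in one of your event streams (e.g., a spike train as a sequence of 0s and 1s) and checking how well the window can be decoded back out of `m` would be a direct way to probe how the memory handles sparse, event-based input.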

In terms of training with unbalanced data, I think the LMU is not special in this regard, since it’s simply a type of neural network layer, and the problems of unbalanced data have more to do with the training process (specifically the loss function). If you have unbalanced data, I would recommend using a weighting scheme that weights rarer examples more highly in your loss function, or presenting examples in a more balanced manner. That said, I can’t comment on your specific problem, and it may be possible to get reasonable results without accounting for the unbalanced data in this way.
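As a concrete example of the weighting scheme mentioned above, here is one common recipe (inverse-frequency class weights); the function name is my own, and whether this helps for your particular dataset is an empirical question.

```python
import numpy as np

def balanced_class_weights(labels):
    """Inverse-frequency class weights: rarer classes get
    proportionally larger weight in the loss function.

    `labels` is an array of integer class IDs; returns a dict
    mapping class ID -> weight, normalized so that a perfectly
    balanced dataset yields weight 1.0 for every class.
    """
    classes, counts = np.unique(labels, return_counts=True)
    weights = len(labels) / (len(classes) * counts)
    return dict(zip(classes.tolist(), weights.tolist()))

# Example: a 90/10 split weights the minority class 9x higher.
labels = np.array([0] * 90 + [1] * 10)
weights = balanced_class_weights(labels)
```

A dict in this form can typically be passed straight to a training loop's per-class loss weighting (e.g., the `class_weight` argument of Keras's `Model.fit`), so nothing about the LMU layer itself needs to change.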

Haha, I had a feeling I would get an answer along those lines @Eric but I really appreciate the information you provided, as well as the notebook. Sounds like I have some work/research to do!