I have never run into the issue below before, so I am wondering whether it is specific to the LMU, and if so, what options, if any, are available.
I created a model (which took many months of trial and error to reach its current state) from time-series data with 3 columns, using an RNN in the style of an LSTM (sliding window, etc.). When building the model, the data was scaled with MinMax computed over the full files. Forward tests show great results.
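For context, my training-time preprocessing is essentially the sketch below (the file name, column count, and window size are placeholders, not my real setup):

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Placeholder file name; 3 columns of time-series data
df = pd.read_csv("history.csv")

# Min/max are taken over the entire file before windowing
scaler = MinMaxScaler()
scaled = scaler.fit_transform(df.values)

def make_windows(data, window=50):
    """Sliding windows: each sample is the previous `window` rows;
    the target is the next row's first column."""
    X, y = [], []
    for i in range(window, len(data)):
        X.append(data[i - window:i])
        y.append(data[i, 0])
    return np.array(X), np.array(y)

X, y = make_windows(scaled)
```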
When I emulate live streaming data by processing historical data row by row, rather than prescaling it as in the forward test, the results are again positive and identical to the previous forward test. However, in real life the true MinMax cannot be known in advance for one of the columns, so over the course of about a week I have experimented with using averages of historical data for the min/max, along with numerous other variations of setting the min/max manually. I have also tried MinMaxScaler with partial_fit. So far I have been unable to reproduce the excellent results I get with the true MinMax, and as it stands the model is unusable.
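To be concrete, the streaming-scaling variants I have tried look roughly like this (all values and names below are placeholders, not my real data):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Variant 1: fixed min/max chosen manually from historical averages
# (the two rows are assumed per-column min and max, purely illustrative)
fixed_scaler = MinMaxScaler()
fixed_scaler.fit(np.array([[0.0,  10.0, -1.0],
                           [100.0, 500.0, 1.0]]))

# Variant 2: update the running min/max as each new row arrives
stream_scaler = MinMaxScaler()

def scale_streaming_row(row):
    """Update the running min/max with the new row, then scale it.
    Early rows are scaled with very little history, so their scale
    can drift relative to the training-time scaling."""
    stream_scaler.partial_fit(row.reshape(1, -1))
    return stream_scaler.transform(row.reshape(1, -1))

new_row = np.array([42.0, 250.0, 0.3])   # one incoming row (placeholder values)
scaled_fixed = fixed_scaler.transform(new_row.reshape(1, -1))
scaled_stream = scale_streaming_row(new_row)
```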
My question is: is this something unique to the LMU, and either way, are there any ideas on how to resolve this scaling issue? I have tried recreating the model with StandardScaler and other scalers without success, which I expected: the hyperparameters were tuned for the original scaling, so switching scalers is essentially starting over from scratch.
I would be deeply grateful for any assistance, ideas, or even speculation on how I can resolve this issue. I am hoping I haven't wasted months developing a fantastic model only for it to end up unusable.
Thanks in advance!