#random

Tyler Blume

07/21/2023, 2:16 PM
Random, more theoretical question: has anyone ever tried training a sequence model that takes the time series as input, but where the output sequence is the best-performing stat model's forecast rather than the actuals? It seems to me the output sequence could contain signals that aren't in the input sequence (major jumps or changepoints), whereas an output sequence based on a stat output would only contain what the stat model could extract from the input. Let me know if I'm crazy, but either way a fast 'auto-stat emulator' would be neat.

Kin Gtz. Olivares

07/21/2023, 2:22 PM
Hey **@Tyler Blume**,
This is an exciting idea.
You can obtain the outputs of the models as the parameters of distributions using the `return_params` option:
https://nixtla.github.io/neuralforecast/losses.pytorch.html#distributionloss
https://nixtla.github.io/neuralforecast/losses.pytorch.html#gaussian-mixture-mesh-gmm
With the parameters as outputs, you can simulate the model output instead of only having a single forecast. This is almost a generative model. The Gaussian Mixture distribution would allow you to exhibit some jumps and "regime switches".
If you go a bit into that rabbit hole, the new DeepAR model does a Monte Carlo simulation to generate its quantiles too.
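As a minimal numpy sketch of that Monte Carlo step — assuming the network emits per-step Gaussian-mixture parameters (the kind of output `return_params` exposes); the function name and shapes here are illustrative, not the neuralforecast API:

```python
import numpy as np

def gmm_mc_quantiles(weights, means, stds, n_samples=1000, qs=(0.1, 0.5, 0.9), seed=0):
    """Monte Carlo quantiles from Gaussian-mixture parameters.

    weights, means, stds: arrays of shape (horizon, n_components),
    as a network trained with a GMM loss might emit per forecast step.
    (Illustrative sketch, not the neuralforecast API.)
    """
    rng = np.random.default_rng(seed)
    horizon, k = means.shape
    # Pick a mixture component per sample and step, then draw from it.
    comp = np.array([rng.choice(k, size=n_samples, p=w / w.sum()) for w in weights])
    rows = np.arange(horizon)[:, None]
    draws = rng.normal(means[rows, comp], stds[rows, comp])  # (horizon, n_samples)
    return np.quantile(draws, qs, axis=1)                    # (len(qs), horizon)
```

With a bimodal mixture the sampled band straddles both modes, which is how the jumps and "regime switches" mentioned above show up in the quantiles.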

Manuel

07/21/2023, 2:47 PM
Basically you would train a sequence model to simulate a stat model (e.g. ARIMA). It's something I've thought about in the past to handle a mix of long and short time series. ARIMA, for example, requires at least 1 year of data to make a forecast with yearly seasonality (and 2 years to be able to fit the model). What you could do is fit ARIMA and make predictions for the sufficiently long time series, train a neural model targeting the ARIMA predictions, and then use that neural model to make predictions for the time series shorter than 1 year (perhaps with some exogenous variables that can help the model, such as calendar features).
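A toy end-to-end sketch of that recipe, with a seasonal-naive forecaster standing in for ARIMA and ordinary least squares standing in for the neural model so it runs with only numpy; all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
season, window = 12, 6  # yearly season; emulator input shorter than one season

def seasonal_naive(y):
    # Stand-in "stat model": forecast the next point as the value one season ago.
    # (It needs at least `season` points, like ARIMA needs a full year.)
    return y[-season]

# 1) On long series (3 seasons each), record (short window -> stat forecast) pairs.
X, t = [], []
for _ in range(200):
    phase = rng.uniform(0, 2 * np.pi)
    idx = np.arange(3 * season)
    y = np.sin(2 * np.pi * idx / season + phase) + 0.05 * rng.normal(size=idx.size)
    X.append(y[-window:])        # the emulator only ever sees the last 6 points
    t.append(seasonal_naive(y))  # but is trained to reproduce the stat forecast
X, t = np.array(X), np.array(t)

# 2) "Neural model": plain least squares is enough for a sketch.
A = np.c_[X, np.ones(len(X))]
coef, *_ = np.linalg.lstsq(A, t, rcond=None)
train_mse = np.mean((A @ coef - t) ** 2)

# 3) Emulate the stat model on a series too short to fit it directly.
short = np.sin(2 * np.pi * np.arange(window) / season)  # only half a season observed
emulated = np.c_[short[None], np.ones((1, 1))] @ coef
```

In practice steps 1–2 would use real series, something like statsforecast's AutoARIMA for the targets, and a neuralforecast model as the emulator; calendar features could be appended to each window as suggested above.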

Tyler Blume

07/21/2023, 2:50 PM