# neural-forecast
d
Oh, as an aside. Is there a way to include lagged features in the neuralforecast models?
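For context on how lagged regressors can be passed in, here is a minimal sketch assuming the `hist_exog_list` argument of models such as NBEATSx; the data, column names, and sizes are illustrative, not from this thread.

```python
import numpy as np
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATSx

# Hypothetical quarterly series in the long format NeuralForecast expects:
# unique_id | ds | y, plus any extra historical regressors as ordinary columns.
rng = np.random.default_rng(0)
Y_df = pd.DataFrame({
    'unique_id': 'series_1',
    'ds': pd.date_range('2000-03-31', periods=120, freq='Q'),
    'y': rng.normal(size=120).cumsum(),
})
# Hand-built lagged features (illustrative; any past-observed column works).
Y_df['y_lag1'] = Y_df['y'].shift(1).fillna(0.0)
Y_df['y_lag2'] = Y_df['y'].shift(2).fillna(0.0)

model = NBEATSx(
    h=1,
    input_size=8,                         # autoregressive lags of y handled by the model
    hist_exog_list=['y_lag1', 'y_lag2'],  # extra lagged features, observed only in the past
    max_steps=100,
)
nf = NeuralForecast(models=[model], freq='Q')
nf.fit(df=Y_df)
forecasts = nf.predict()
```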
Last question, I promise! When I forecast over different horizons, say `h=1` vs `h=2`, I get very different results with respect to the one-period-ahead forecast. Why would this be the case? I am determining the size of the validation set in the `fit` method. Are there other places where the horizon features in the tuning of hyperparameters? I have fixed the `input_size` as well, so that the `input_size` is not dependent on the horizon.
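To make the setup concrete, a small sketch (under the assumptions described here) where `input_size` and the validation window are held fixed and only `h` varies; `Y_df` stands in for a long-format frame like the one in the sketch above.

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

def fit_with_horizon(Y_df, h):
    """Fit with a fixed input_size and a fixed validation window; only h varies."""
    model = NHITS(h=h, input_size=8, max_steps=100)  # input_size not tied to h
    nf = NeuralForecast(models=[model], freq='Q')
    nf.fit(df=Y_df, val_size=25)  # validation window set here, independent of h
    return nf.predict()

# forecasts_h1 = fit_with_horizon(Y_df, h=1)
# forecasts_h2 = fit_with_horizon(Y_df, h=2)
```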
k
Hey @Dawie van Lill,
• The config seems correct.
• The `input_size` controls the lags/autoregressive features of the method; it is convenient to make it a multiple of the horizon (for example, for 24 hours ahead you may want `input_size=2*24` or `input_size=7*24`).
• Regarding the model variance between horizons (a sketch combining these suggestions follows below):
◦ try an NBEATSx architecture with `dropout_prob_theta` regularization.
◦ use a robust loss like MAE/Huber loss (https://nixtla.github.io/neuralforecast/losses.pytorch.html#huber-loss).
◦ increase the `valid_size` to improve the validation signal; `h=1` or `h=2` is a very small window and the hyperparameter optimization will be noisy (https://nixtla.github.io/neuralforecast/common.base_auto.html).
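Putting those suggestions together, a hedged sketch of an AutoNBEATSx search space with `dropout_prob_theta`, an MAE loss, and a larger validation window; the exact config keys and defaults depend on the installed neuralforecast version, and `Y_df` again stands in for the quarterly frame from the first sketch.

```python
from ray import tune
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNBEATSx
from neuralforecast.losses.pytorch import MAE

# Illustrative search space; only dropout_prob_theta and input_size come from the thread.
config = {
    "input_size": tune.choice([8, 12]),             # multiples of the horizon where possible
    "dropout_prob_theta": tune.choice([0.1, 0.3]),  # NBEATSx basis-coefficient dropout
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "max_steps": tune.choice([500, 1000]),
}
model = AutoNBEATSx(h=4, loss=MAE(), config=config, num_samples=10)
nf = NeuralForecast(models=[model], freq='Q')
nf.fit(df=Y_df, val_size=25)  # a larger val_size gives a less noisy tuning signal
```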
d
Thanks for the feedback. I am working with quarterly data and I am mostly interested in the one-period-ahead forecast, hence `h=1`. I realised after playing around with the code that the `valid_size` is dependent on the horizon, so I fixed it at about 10% of the entire sample size (around 25). I am also interested in the longer-term forecast `h=4`, which gives me all the quarterly forecasts up to one year ahead. However, this is where I notice a significant difference between my specifications with `h=1` and `h=4`. Nothing is different between the models, and I use the MAE loss function, as you suggested. I might be missing something here. Finally, I tried the regularisation with the NBEATSx architecture and with NHITS, but I am getting some errors with regard to dropout. The code runs fine without it, but once I include the dropout component it breaks for some reason. I have `"dropout_prob_theta": tune.choice([0.1, 0.3])` specified in the config.
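One way to compare the `h=1` and `h=4` specifications like for like is to cross-validate and keep only the first step ahead of each window; a sketch under the same assumptions as above, with illustrative window counts.

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.losses.pytorch import MAE

def one_step_cv(Y_df, h):
    """Cross-validate, then keep only the first step ahead of each cutoff,
    which is the quantity comparable across horizons."""
    model = NHITS(h=h, input_size=8, loss=MAE(), max_steps=100)
    nf = NeuralForecast(models=[model], freq='Q')
    cv = nf.cross_validation(df=Y_df, n_windows=8, step_size=1, val_size=25)
    return cv.sort_values('ds').groupby('cutoff').head(1)

# cv_h1 = one_step_cv(Y_df, h=1)
# cv_h4 = one_step_cv(Y_df, h=4)
```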
k
You might need to update NeuralForecast from main: `!pip install git+https://github.com/Nixtla/neuralforecast.git`