# neural-forecast
Cristian (Nixtla):
Hi @lobbie lobbie! The `Auto` models already do cross-validation under the hood to select optimal hyperparameters. You simply need to instantiate the model and use either the `fit`/`predict` or `cross_validation` methods. You can specify the size of the validation set with `val_size`. During the validation step, the model is evaluated on the entire validation set using rolling windows without refitting. See the image below. You can't access the loss per window directly, only the average, stored in the `validation_trajectories` object via `nf.models[0].train_trajectories`.
After hyperparameter selection, you can access all the configurations tried and their final loss by following this tutorial: https://nixtlaverse.nixtla.io/neuralforecast/examples/automatic_hyperparameter_tuning.html. After selecting the best configuration, the model is refit; you can choose whether the refit uses the validation set with `refit_with_val`.
Note that the main use of the `cross_validation` method is to recover the rolling predictions for a third, test-set split, so you don't need to call the `predict` method multiple times.
lobbie:
Thank you @Cristian (Nixtla). So if I want to use an anchored window instead of a rolling window, i.e. always starting from t1, should I set the step size = 0? Also, the picture you showed is not forecasting for t9 and t10. Does that mean nfc does not use the remaining periods for forecasting in the 3rd test set split?