# mlforecast
p
Hello everyone, this is my first time here, so please pardon me if this question has been asked before. After performing hyperparameter optimization in the mlforecast library using the Auto classes such as AutoXGBoost and AutoLightGBM, is there any direct way to perform cross-validation on the newly retrained models? Thank you so much.
j
Hey. Do you mean on the model that is retrained at the end?
p
Hi Jose, thank you for the reply. To clarify: the `ml_forecaster` (`MLForecast`) object has a built-in `cross_validation` method that is very helpful. Moving on to the auto forecaster, say I have `auto_mlf = AutoMLForecast(models={'xgb': AutoXGBoost(), 'lgbm': AutoLightGBM()}, freq='M', season_length=12)` and then call `auto_mlf.fit(rf_df, n_windows=5, h=12, num_samples=10)`. Is there any way to perform cross-validation directly on the optimized models, or do we have to take the optimized params for each model and pass them back into an `MLForecast` object?
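Written out for readability, my setup is the following (imports added for completeness; `rf_df` is my monthly training dataframe with the usual `unique_id`, `ds`, `y` columns):

```python
from mlforecast.auto import AutoMLForecast, AutoXGBoost, AutoLightGBM

# hyperparameter search over XGBoost and LightGBM on monthly data
auto_mlf = AutoMLForecast(
    models={'xgb': AutoXGBoost(), 'lgbm': AutoLightGBM()},
    freq='M',
    season_length=12,
)
auto_mlf.fit(
    rf_df,           # training dataframe
    n_windows=5,     # CV windows used during the search
    h=12,            # forecast horizon
    num_samples=10,  # optuna trials per model
)
```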
j
The `fit` method creates a `models_` attribute which holds the best models retrained on the full dataset. For example, `auto_mlf.models_['xgb']` will be an `MLForecast` object with a single model (an `XGBRegressor`), so you could perform cross-validation on it if you want with `auto_mlf.models_['xgb'].cross_validation(...)`. But note that this has already been carried out (unless you specify different settings like `n_windows` or `h`), and you can get the CV score from `auto_mlf.results_`, which is a list with one optuna study per model.
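In code, roughly (a sketch; the window settings below are just examples, and by default this re-runs what the search already evaluated):

```python
# the best config per model is refit on the full data and stored in models_
best_xgb = auto_mlf.models_['xgb']  # an MLForecast object wrapping an XGBRegressor

# run cross-validation on that refit model (settings here are illustrative)
cv_df = best_xgb.cross_validation(
    rf_df,
    n_windows=5,
    h=12,
)
print(cv_df.head())

# the CV results from the search itself: one optuna study per model
print(auto_mlf.results_)
```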
p
🙏🙏🙏🙏 thank you so much Jose
🙌 1