# mlforecast
f
I have one question: when are the hyperparameters tuned? I thought that when you did cross-validation the model optimized the hyperparameters, but it's just doing a normal cross-validation.
👍 1
j
Cross-validation refers to doing just that: evaluating the same model over different folds. If you want to do hyperparameter tuning, you have to define a search space and evaluate different combinations (for which you can use CV).
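A minimal sketch of what defining a search space could look like; the parameter names and ranges here are purely illustrative:

```python
from itertools import product

# hypothetical search space for a gradient-boosting model
search_space = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.05, 0.1],
    "n_estimators": [100, 300],
}

# every combination that will be scored with cross-validation
combinations = [dict(zip(search_space, values)) for values in product(*search_space.values())]
```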
f
Can you go a little more in depth? What do you mean, I can use cross-validation to do that? How do I define a search space? Are there any tutorials/examples? When I read the mlforecast documentation I didn't find any of that. Do the Nixtla libraries have any function to do that, or do I have to build it myself?
j
I mean you can use the CV score as your evaluation for a specific combination. The search space depends on what you're trying to optimize (features, hyperparameters, or both). There isn't anything built in, but there's an example using LightGBMCV here. You can either use that or just run MLForecast.cross_validation and compute a score from it.
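A hedged sketch of the second option, scoring one combination with MLForecast.cross_validation (the frequency, lags, horizon, and the `train_df`/`combinations` names are assumptions carried over from the sketch above, and the cross_validation argument names can differ between mlforecast versions):

```python
from lightgbm import LGBMRegressor
from mlforecast import MLForecast

def cv_score(df, params, h=14, n_windows=3):
    """Mean absolute error of one hyperparameter combination across the CV folds."""
    fcst = MLForecast(
        models=[LGBMRegressor(**params)],
        freq="D",          # assumed daily series
        lags=[1, 7, 14],   # assumed feature set
    )
    cv_df = fcst.cross_validation(df, n_windows=n_windows, h=h)
    # cross_validation returns the actuals ('y') next to one prediction column per model
    return (cv_df["y"] - cv_df["LGBMRegressor"]).abs().mean()

# pick the combination with the lowest CV score
best_params = min(combinations, key=lambda p: cv_score(train_df, p))
```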
f
I already did the cross-validation and wrote a function to compute the score for each model, but that isn't optimizing the features or the hyperparameters; it is using the default ones. How can I check the hyperparameters of, let's say, XGBoost?
j
you set them when you define the models, e.g.
```python
MLForecast(models=[XGBRegressor(max_depth=3)])
```
f
is there a way to see the hyperparameter matrix?
j
do you mean seeing which arguments the xgboost constructor takes?
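For reference, any model that follows the scikit-learn API exposes get_params(), so one way to list XGBoost's hyperparameters and their current defaults is:

```python
from xgboost import XGBRegressor

# dict of every constructor argument and its current (default) value
print(XGBRegressor().get_params())
```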
v
@José Morales has anything on hyperparameter optimisation been added to mlforecast since this discussion? I am trying to find some materials but there does not seem to be any. HPO is a key stage of optimising boosted trees to obtain good performance.
j
We have this example using LightGBMCV, which includes trial pruning; you could do something similar with the MLForecast class. Also, @Tyler Blume has this nice wrapper and we're working with him on incorporating it into mlforecast.
👍 1
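A hedged sketch of the "something similar with the MLForecast class" route using Optuna (the search ranges, frequency, lags, and `train_df` are assumptions; unlike LightGBMCV, this simple version does not prune bad trials mid-training):

```python
import optuna
from lightgbm import LGBMRegressor
from mlforecast import MLForecast

def objective(trial):
    params = {
        "num_leaves": trial.suggest_int("num_leaves", 8, 128, log=True),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "n_estimators": trial.suggest_int("n_estimators", 50, 500),
    }
    fcst = MLForecast(models=[LGBMRegressor(**params)], freq="D", lags=[1, 7, 14])
    cv_df = fcst.cross_validation(train_df, n_windows=3, h=14)  # train_df: your training frame
    return (cv_df["y"] - cv_df["LGBMRegressor"]).abs().mean()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=30)
print(study.best_params)
```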
v
Thank you @José Morales. Does mlforecast only work with LightGBM, or with other models like CatBoost as well?
j
It works with all models that follow the scikit-learn API
👍 1
v
@José Morales presumably this includes CatBoost?
j
It should, there's an example here. There's a lot of stuff but you can search for the "Building Model" section
👍 1
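A minimal sketch of plugging CatBoost (or any other scikit-learn-compatible regressor) into MLForecast; the frequency, lags, horizon, and `train_df` are assumptions, and predict's argument name can differ between mlforecast versions:

```python
from catboost import CatBoostRegressor
from lightgbm import LGBMRegressor
from mlforecast import MLForecast

fcst = MLForecast(
    models=[
        CatBoostRegressor(depth=6, verbose=0),  # any scikit-learn-style regressor works
        LGBMRegressor(max_depth=6),
    ],
    freq="D",         # assumed daily data
    lags=[1, 7, 14],  # assumed feature set
)
fcst.fit(train_df)          # train_df with unique_id, ds, y columns (assumed)
preds = fcst.predict(h=14)  # one prediction column per model
```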
v
That looks great, thank you @José Morales
@José Morales is there a way to get SHAP explainability, like in this example https://github.com/narencastellon/mlforecast/blob/catboost_regresor/nbs/docs/tutorials/catboost_regressor_forecasting.ipynb, after an mlforecast model has been trained with CatBoost?
j
This process should work for any model
🙌 1
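A hedged sketch of that process for a CatBoost model trained through MLForecast, along the lines of the SHAP how-to guide linked later in the thread; it assumes the default column names and that the fitted model is available through the models_ attribute:

```python
import shap

# rebuild the feature matrix the model was trained on
prep = fcst.preprocess(train_df)                 # fcst/train_df from the sketch above (assumed)
X = prep.drop(columns=["unique_id", "ds", "y"])  # assumes the default column names

# grab the fitted CatBoost model and explain its predictions
model = fcst.models_["CatBoostRegressor"]        # fitted models are stored by name after fit
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X)
```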
v
Nice, thank you @José Morales
a
@José Morales any luck applying SHAP to neural-forecast? I probably have to ask in the other channel, but I saw SHAP covered in a tutorial here for mlforecast: https://nixtlaverse.nixtla.io/mlforecast/docs/how-to-guides/analyzing_models.html#shap