# mlforecast

Francisco

10/25/2023, 10:11 PM
I have one question: when are the hyperparameters tuned? I thought that when you did cross validation the model optimized the hyperparameters, but it's just doing a normal cross validation.

José Morales

10/25/2023, 10:19 PM
Cross validation refers to exactly that: evaluating the same model over different folds. If you want to do hyperparameter tuning, you have to define a search space and evaluate different combinations (for which you can use CV).

Francisco

10/25/2023, 10:27 PM
Can you go a little more in depth? What do you mean, I can use cross validation for that? How do I define a search space? Are there any tutorials/examples? I haven't found any of that in the mlforecast documentation. Do the Nixtla libraries have any function for this, or do I have to build it myself?

José Morales

10/25/2023, 10:30 PM
I mean you can use the CV score as your evaluation for a specific combination. The search space depends on what you're trying to optimize (features, hyperparameters, or both). There isn't anything built in, but there's an example using LightGBMCV here. You can either use that or just run MLForecast.cross_validation and compute a score from it.
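A minimal sketch of that loop, assuming a hypothetical search space; the scoring here is a placeholder, since in practice the score for each combination would come from running MLForecast.cross_validation and averaging an error metric over the folds:

```python
from itertools import product

# Hypothetical search space for XGBRegressor hyperparameters.
search_space = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.05, 0.1],
}

def evaluate(params):
    # Placeholder score. In practice you would build
    # MLForecast(models=[XGBRegressor(**params)], ...), run
    # .cross_validation(...) and compute e.g. the mean RMSE
    # across folds for this combination.
    return params["max_depth"] * params["learning_rate"]

# Evaluate every combination in the grid.
results = []
for values in product(*search_space.values()):
    params = dict(zip(search_space.keys(), values))
    results.append((evaluate(params), params))

# Keep the combination with the lowest score.
best_score, best_params = min(results, key=lambda r: r[0])
print(best_params)
```

The same loop works whether the combinations vary hyperparameters, feature configurations (lags, transforms), or both; only the dictionary of candidates changes.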

Francisco

10/25/2023, 10:55 PM
I already did the cross validation and wrote a function to compute the score for each model. But that isn't optimizing the features or the hyperparameters; it's using the default ones. How can I check the hyperparameters of, let's say, xgboost?

José Morales

10/25/2023, 11:09 PM
You set them when you define the models, e.g.

```python
from mlforecast import MLForecast
from xgboost import XGBRegressor

MLForecast(models=[XGBRegressor(max_depth=3)])
```

Francisco

10/25/2023, 11:18 PM
Is there a way to see the hyperparameter matrix?

José Morales

10/25/2023, 11:33 PM
Do you mean seeing which arguments the xgboost constructor takes?
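One generic way to list a constructor's arguments, sketched here with a dummy class (the class name and its defaults are illustrative, not xgboost's real ones); with the real library you could also call XGBRegressor().get_params(), since it follows the scikit-learn estimator convention:

```python
import inspect

# Hypothetical stand-in for an estimator like xgboost.XGBRegressor;
# the parameter names and defaults below are made up for illustration.
class DummyRegressor:
    def __init__(self, max_depth=6, learning_rate=0.3, n_estimators=100):
        self.max_depth = max_depth
        self.learning_rate = learning_rate
        self.n_estimators = n_estimators

# inspect.signature lists constructor parameters in definition order;
# [1:] drops the leading `self`.
params = list(inspect.signature(DummyRegressor.__init__).parameters)[1:]
print(params)  # ['max_depth', 'learning_rate', 'n_estimators']
```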