# general
Emre Varol:
Hi everyone, I enjoy using the Nixtla libraries, particularly `mlforecast`. They are quite speedy. However, after upgrading `xgboost` on Colab, I noticed that the forecasts have changed dramatically. Could you please check the pic and the notebook below? Any suggestions are welcome. @Max (Nixtla) @fede (nixtla) (they/them)
Max (Nixtla):
Sure thing! @José Morales is currently updating some stuff. He might be able to help.
José Morales:
Hi @Emre Varol, thanks for the reproducible example. The differences come from changes in xgboost's default parameters. The Colab version is very old, so it's hard to pinpoint the exact changes, but using the parameters that were the default in 0.9 reduces the differences. Running `xgb.XGBRegressor(objective='reg:squarederror').get_params()` on xgb 0.9 returns:
```python
{'base_score': 0.5,
 'booster': 'gbtree',
 'colsample_bylevel': 1,
 'colsample_bynode': 1,
 'colsample_bytree': 1,
 'gamma': 0,
 'importance_type': 'gain',
 'learning_rate': 0.1,
 'max_delta_step': 0,
 'max_depth': 3,
 'min_child_weight': 1,
 'missing': None,
 'n_estimators': 100,
 'n_jobs': 1,
 'nthread': None,
 'objective': 'reg:squarederror',
 'random_state': 0,
 'reg_alpha': 0,
 'reg_lambda': 1,
 'scale_pos_weight': 1,
 'seed': None,
 'silent': None,
 'subsample': 1,
 'verbosity': 1}
```
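To see exactly which defaults moved between xgboost versions, one option is to diff the two `get_params()` dicts. A minimal sketch (the "new" values below are illustrative placeholders, not the actual defaults of any particular release):

```python
# Subset of the xgb 0.9 defaults quoted above.
old_defaults = {'base_score': 0.5, 'learning_rate': 0.1,
                'max_depth': 3, 'n_estimators': 100}
# Illustrative stand-ins for a newer version's defaults (assumption).
new_defaults = {'base_score': 0.5, 'learning_rate': 0.3,
                'max_depth': 6, 'n_estimators': 100}

def diff_params(a, b):
    """Return {name: (old, new)} for every parameter whose value differs."""
    keys = set(a) | set(b)
    return {k: (a.get(k), b.get(k)) for k in sorted(keys) if a.get(k) != b.get(k)}

changed = diff_params(old_defaults, new_defaults)
print(changed)  # only the parameters whose defaults moved
```

In practice you would run `xgb.XGBRegressor(objective='reg:squarederror').get_params()` in each environment and diff those dicts instead of the hand-written ones here.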
so using the following in your `get_forecast` function with xgb 1.6 I get pretty much the same results:
```python
import math

import xgboost as xgb

# xgb 0.9 defaults; note `missing` must now be NaN rather than None
params = {
    'base_score': 0.5,
    'booster': 'gbtree',
    'colsample_bylevel': 1,
    'colsample_bynode': 1,
    'colsample_bytree': 1,
    'gamma': 0,
    'importance_type': 'gain',
    'learning_rate': 0.1,
    'max_delta_step': 0,
    'max_depth': 3,
    'min_child_weight': 1,
    'missing': math.nan,
    'n_estimators': 100,
    'n_jobs': 1,
    'nthread': None,
    'objective': 'reg:squarederror',
    'random_state': 0,
    'reg_alpha': 0,
    'reg_lambda': 1,
    'scale_pos_weight': 1,
    'seed': None,
    'silent': None,
    'subsample': 1,
    'verbosity': 1,
}
model = xgb.XGBRegressor(**params)
```
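To quantify "pretty much the same results", a small helper like this (hypothetical, not part of the notebook) reports the largest absolute gap between two forecast series:

```python
def max_abs_diff(a, b):
    """Largest absolute element-wise difference between two forecast series."""
    if len(a) != len(b):
        raise ValueError("series must be the same length")
    return max(abs(x - y) for x, y in zip(a, b))

# Compare forecasts produced under the old and new xgboost versions
# (values here are illustrative, not from the notebook).
old_fcst = [100.0, 102.5, 101.0]
new_fcst = [100.0, 102.4, 101.2]
print(max_abs_diff(old_fcst, new_fcst))  # roughly 0.2
```

If the gap stays within a tolerance you're comfortable with, the pinned parameters have effectively reproduced the old behavior.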
Emre Varol:
Many thanks @José Morales