# mlforecast
**Nasreddine D:**
Hi, is there a way of doing hyperparameter tuning (LGBM or any other model) and feature selection at the same time by launching a single cross-validation? And at the end, showing the best hyperparameters and the feature importance? Maybe it's not good practice to do it (I don't know); if not, what would be the best way, and how would I do it, please? In the current state, is it possible to get feature importance after a cross-validation? Thank you very much. I have checked these two tutorials but couldn't find how to do it: https://nixtla.github.io/mlforecast/docs/cross_validation.html https://nixtla.github.io/mlforecast/lgb_cv.html#example
**fede (nixtla):**
hey @Nasreddine D! Currently, mlforecast does not support automatic hyperparameter + feature tuning. Could you help us open an issue to work on that feature? https://github.com/Nixtla/mlforecast/issues/new/choose In the meantime, one option might be to use hyperopt or optuna to create a pipeline that receives different hyperparameters and features and optimizes them. Inside the objective function, you could include the `cross_validation` method to optimize the hyperparameters and features considering different windows.
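As a sketch of what such a pipeline could look like (illustrative only: the DataFrame `df`, the candidate lags, and the LightGBM search space are assumptions, not part of mlforecast's API), each Optuna trial can suggest both model hyperparameters and which features to build:

```python
# Sketch of joint hyperparameter + feature tuning with Optuna: each trial
# picks LightGBM hyperparameters AND which lags to build, then is scored
# with mlforecast's cross_validation. `df` is an assumed long-format
# DataFrame with unique_id, ds and y columns.
import numpy as np
import optuna
import lightgbm as lgb
from mlforecast import MLForecast

def objective(trial):
    params = {
        'learning_rate': trial.suggest_float('learning_rate', 0.01, 0.3, log=True),
        'num_leaves': trial.suggest_int('num_leaves', 8, 128),
        'random_state': 0,
    }
    # "Feature selection": the trial decides which candidate lags to include.
    lags = [l for l in [1, 2, 3, 6, 12, 24]
            if trial.suggest_categorical(f'use_lag_{l}', [True, False])]
    if not lags:
        return float('inf')  # need at least one feature

    mlf = MLForecast(models=[lgb.LGBMRegressor(**params)], freq=1, lags=lags)
    cv_df = mlf.cross_validation(data=df, window_size=24, n_windows=4)
    # Overall RMSE across all windows and series.
    return np.sqrt(np.mean((cv_df['y'] - cv_df['LGBMRegressor']) ** 2))

study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=50)
print(study.best_params)  # winning hyperparameters + which lags were kept
```

`study.best_params` then holds both the best hyperparameters and the feature choices from the same run of cross-validations.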
**Nasreddine D:**
Hi @fede (nixtla) (they/them), I have used Optuna to tune a Lasso model. I had the following idea (please let me know if it is relevant):
1. Extract features from a univariate TS (lags, rolling windows, expanding windows).
2. Tune the `alpha` hyperparameter of the Lasso model with a `cross_validation`.
3. This best model, with the best alpha, will automatically select the best features (putting a 0 coefficient on "useless" features).
4. Then I can train a LGBM model with the best features selected in the above steps.

When I performed the first 3 steps, the model selected the best alpha, but all feature coefficients were at 0 (so basically no feature was selected). Please find below the code I used.
```python
import numpy as np
import optuna
from sklearn.linear_model import Lasso
from window_ops.rolling import rolling_mean, rolling_max, rolling_min
from mlforecast import MLForecast
from mlforecast.target_transforms import Differences, LocalStandardScaler

def rmse(y, y_hat):
    return np.sqrt(np.mean((y - y_hat) ** 2))

# Hyperparameter tuning
def objective(trial):
    alpha = trial.suggest_float('alpha', 0.00001, 1, log=True)

    models = [Lasso(alpha=alpha, random_state=0, max_iter=5000)]

    mlf = MLForecast(
        models=models,
        freq=1,
        target_transforms=[Differences([12]), LocalStandardScaler()],
        lags=np.arange(1, 37),
        # A Python dict keeps only the last value for a repeated key, so all
        # window sizes for a given lag must go in a single list.
        lag_transforms={
            **{
                lag: [
                    (rolling_mean, 6), (rolling_max, 6), (rolling_min, 6),
                    (rolling_mean, 12), (rolling_max, 12), (rolling_min, 12),
                    (rolling_mean, 24), (rolling_max, 24), (rolling_min, 24),
                ]
                for lag in [1, 2, 3, 6]
            },
            12: [
                (rolling_mean, 6), (rolling_max, 6), (rolling_min, 6),
                (rolling_mean, 12), (rolling_max, 12), (rolling_min, 12),
            ],
        },
    )

    crossvalidation_df = mlf.cross_validation(
        data=Y_ts,
        window_size=24,
        n_windows=30,
        step_size=1,
    )

    # Mean of the per-cutoff RMSEs; Optuna expects a plain float.
    cv_rmse = (
        crossvalidation_df
        .groupby('cutoff')
        .apply(lambda x: rmse(x['y'].values, x['Lasso'].values))
        .mean()
    )
    return cv_rmse

study_lasso = optuna.create_study(direction='minimize')
study_lasso.optimize(objective, n_trials=50)
```
Below is what I get when I retrieve the coefficients after training:
```python
mlf_lasso.models_['Lasso'].coef_
```
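Two things are worth checking here: a Python dict keeps only the last value for a repeated key, so duplicated lag keys in `lag_transforms` silently drop most of the intended features, and an alpha near the top of the search range can shrink every coefficient of a standardized target to zero. To see which features survive, the coefficients can be paired with their feature names. A minimal sketch, assuming `mlf_lasso` was refit on `Y_ts` with the best alpha from the study:

```python
# Sketch: pair the Lasso coefficients with feature names to see which
# features survived. Assumes mlf_lasso has been fit on Y_ts with the best
# alpha; preprocess materializes the same feature matrix used for training.
prep = mlf_lasso.preprocess(Y_ts)
feature_names = prep.columns.drop(['unique_id', 'ds', 'y'])
coefs = mlf_lasso.models_['Lasso'].coef_

# The nonzero-coefficient features would then feed the LGBM model in step 4.
selected = [name for name, coef in zip(feature_names, coefs) if coef != 0]
print(selected)
```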