# neural-forecast
b
```python
modelLSTM = AutoLSTM(h=h,
                     loss=MAE(),
                     backend='optuna',
                     num_samples=10)

nf = NeuralForecast(models=[modelLSTM], freq='ME')
nf.fit(df=df_encoded, val_size=18)
```
Hi. When I do this, I get an error after running for a while: `Exception: Time series is too short for training, consider setting a smaller input size or set start_padding_enabled=True`. Where are we expected to put the `start_padding_enabled` argument?
m
Hello! You do it in the LSTM model. Something like:
```python
config = dict(start_padding_enabled=True)
model = AutoLSTM(h=h, config=config)
```
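One caveat: with `backend='optuna'`, the `config` argument is usually a function of an Optuna trial rather than a plain dict. A minimal sketch of that pattern (assuming the `get_default_config` helper available on recent neuralforecast Auto models; verify against your installed version), which extends the default search space instead of replacing it:

```python
from neuralforecast.auto import AutoLSTM

# Sketch: build on the default Optuna search space and add the padding flag,
# rather than replacing the whole config with a single key.
def config_lstm(trial):
    config = AutoLSTM.get_default_config(h=h, backend="optuna")(trial)
    config["start_padding_enabled"] = True
    return config

modelLSTM = AutoLSTM(h=h, config=config_lstm, backend="optuna", num_samples=10)
```

Replacing the config entirely (as in a bare `{"start_padding_enabled": True}`) removes the default hyperparameter search space, so extending it is usually what you want.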
b
Thanks! How can I perform cross validation with the best LSTM configuration found with AutoLSTM? My current code is this:
```python
def config_lstm(trial):
    return {"start_padding_enabled": True}

modelLSTM = AutoLSTM(h=h,
                     config=config_lstm,
                     loss=MAE(),
                     backend='optuna',
                     num_samples=10)

nf = NeuralForecast(models=[modelLSTM], freq='ME')
nf.fit(df=df_encoded, val_size=18)
```
I have added the following, but training times are very long. I believe neuralforecast is also global. MLForecast takes at most 3 minutes for the same settings, whereas this takes 2 hours. Does the following use the best configuration? And how can I improve performance, since it is currently performing really badly?
```python
cv_result_lstm = nf.cross_validation(
    df=df_encoded,
    n_windows=n_windows,
    step_size=step_size,
    val_size=18,
    id_col='unique_id',
    time_col='ds',
    target_col='y')
```
m
You can read our tutorial on hyperparameter optimization to see how to get the config for each run and then select the best one. For better results, I would suggest a model other than LSTM; something like NHITS would likely perform better and train faster.
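For reference, swapping in NHITS is a small change; a hedged sketch reusing the arguments from the snippets above (`AutoNHITS` as named in neuralforecast; check your version):

```python
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS
from neuralforecast.losses.pytorch import MAE

# Same setup as the AutoLSTM example; only the model class changes.
modelNHITS = AutoNHITS(h=h, loss=MAE(), backend='optuna', num_samples=10)
nf = NeuralForecast(models=[modelNHITS], freq='ME')
nf.fit(df=df_encoded, val_size=18)
```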
b
I have looked at the website but I can't find where to get the best config. For ML I used `config_lgb = auto_mlf.results_['my_lgb'].best_trial.user_attrs['config']`, but this does not apply to neuralforecast.
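A sketch of the NeuralForecast analogue, assuming the Optuna backend (attribute names may differ between versions, so verify against your install):

```python
# With backend='optuna', a fitted Auto model keeps its optuna.Study in .results
study = nf.models[0].results
best_params = study.best_trial.params   # best hyperparameters found
all_runs = study.trials_dataframe()     # config and loss for every trial
```

`best_trial.params` and `trials_dataframe()` are standard Optuna `Study` methods, so once you have the study object the selection works the same way as in the mlforecast snippet above.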