# neural-forecast
Manuel Chabier Escolá Pérez
Hi all, is there any notebook example in which I can see how to implement AutoNBEATSx? So far I have seen this (for NBEATSx without hyperparameter tuning) and this (for hyperparameter tuning without exogenous variables). For example, when I run `models = [AutoNBEATSx(h=n_day_forecast, loss=MAE(), config=nbeats_config, input_size=5*24, futr_exog_list=fut_exog_list, hist_exog_list=hist_exog_list, stat_exog_list=static_list, scaler_type='robust', search_alg=HyperOptSearch(), num_samples=20)]`, I get the following errors: `TypeError: AutoNBEATSx.__init__() got an unexpected keyword argument 'input_size'` and `TypeError: AutoNBEATSx.__init__() got an unexpected keyword argument 'futr_exog_list'`. Thank you very much!
Kin
Hey @Manuel Chabier Escolá Pérez, here are two pointers that may help you.
Hyperparameter optimization tutorial: https://nixtla.github.io/neuralforecast/examples/automatic_hyperparameter_tuning.html
AutoNBEATSx code: https://github.com/Nixtla/neuralforecast/blob/main/neuralforecast/auto.py#L454
Manuel Chabier Escolá Pérez
Thank you very much for your quick response, Kin. However, I still do not understand it. In the NBEATSx class (link) I clearly see how to include the parameters (including `stat_exog_list` and `futr_exog_list`). However, when I read the AutoNBEATSx class (link) I do not see how or where these two parameters have to be passed. The same happens with AutoNHITS: I get the error `TypeError: AutoNHITS.__init__() got an unexpected keyword argument 'futr_exog_list'`.
Kin
You have to put them in the configuration search space defined by the `config` argument, like this:
```python
from ray import tune

nbeatsx_config = {
    "max_steps": 100,                                       # Number of SGD steps
    "input_size": 24,                                       # Size of input window
    "learning_rate": tune.loguniform(1e-5, 1e-1),           # Initial learning rate
    "n_blocks": tune.choice([[1, 1, 1], [3, 3, 3]]),        # Blocks per NBEATSx stack
    "stat_exog_list": tune.choice([['s1', 's2'], ['s1']]),  # Static exogenous variables
    "futr_exog_list": tune.choice([['f1', 'f2'], ['f1']]),  # Future exogenous variables
    "hist_exog_list": tune.choice([['h1', 'h2'], ['h1']]),  # Historic exogenous variables
    "scaler_type": 'robust',                                # Temporal scaler
    "val_check_steps": 50,                                  # Compute validation every 50 steps
    "random_seed": tune.randint(1, 10),                     # Random seed
}
```
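Then you pass only the Auto-level arguments (`h`, `loss`, `config`, `search_alg`, `num_samples`) to `AutoNBEATSx` itself; everything model-specific lives inside the config. Here is a minimal sketch of how it wires together, assuming hourly data and a recent Ray version for the `HyperOptSearch` import path; `n_day_forecast` comes from your snippet, and `df`/`futr_df` are placeholders for your long-format training data and future exogenous values:

```python
from ray.tune.search.hyperopt import HyperOptSearch

from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNBEATSx
from neuralforecast.losses.pytorch import MAE

# Only Auto-level arguments go here; input_size, exog lists, scaler_type, etc.
# are sampled from nbeatsx_config during the hyperparameter search.
models = [
    AutoNBEATSx(
        h=n_day_forecast,           # placeholder horizon from the question above
        loss=MAE(),
        config=nbeatsx_config,
        search_alg=HyperOptSearch(),
        num_samples=20,
    )
]

nf = NeuralForecast(models=models, freq='H')  # assuming hourly data
nf.fit(df=df)                                 # df: placeholder long-format dataframe
forecasts = nf.predict(futr_df=futr_df)       # futr_df: placeholder future exogenous values
```

AutoNBEATSx will evaluate `num_samples=20` configurations sampled from `nbeatsx_config` and keep the best model for forecasting.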