# neural-forecast
u
Hi there, I ran into a small issue when trying to include exogenous variables in the AutoModel with hyperparameter tuning. Say I made a historical exogenous variable called 'x_7': "hist_exog_list": tune.choice(['x_7'])
### NBeatsx HyperParameter Config
nbeatsx_config = {
       "input_size": tune.choice([1 * input_sequence]),                  # input_size = multiplier * horizon
       "step_size": tune.choice([4]),                            # Window shift size
       "hist_exog_list": tune.choice(['x_7']), #df.columns[1:-2].tolist(),
}
I then put the config into the model initialization and fit on the training input
NBeatsx = AutoNBEATSx(h=output_sequence, config=nbeatsx_config, loss=MAE(), gpus=1, search_alg=HyperOptSearch(), backend='ray', num_samples=40)
nf = NeuralForecast(models=[NBeatsx], freq='H', local_scaler_type='standard')
nf.fit(Y_train_df, val_size=val_size, verbose=True)
The training session then failed with this error
File "/usr/local/lib/python3.10/dist-packages/neuralforecast/common/_base_auto.py", line 336, in _fit_model
    model.fit(dataset, val_size=val_size, test_size=test_size)
  File "/usr/local/lib/python3.10/dist-packages/neuralforecast/common/_base_windows.py", line 697, in fit
    raise Exception(
Exception: {'7', 'x', '_'} historical exogenous variables not found in input dataset
Could anyone please explain this? Also, one extra question: if I am going to have about 700 exogenous variables as input, is it okay or bad to have such a huge exogenous dimension? Thanks!
j
Hey. The features must be a list, and tune.choice expects a sequence of options, so what's happening is that you're providing
'hist_exog_list': 'x_7'
and it should be
'hist_exog_list': ['x_7']
so the config for tune should be something like:
hist_exog_list: tune.choice([['x_7']])
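tune.choice picks one element from the sequence you give it, so the inner value must itself be the list you want. A plain-Python sketch of why the exception lists {'7', 'x', '_'}, using random.choice as a stand-in for tune.choice (an assumption for illustration; both sample one element from the options):

```python
import random

# tune.choice(options) samples one element of options, like random.choice.
# With ['x_7'] the sampled value is the *string* 'x_7' ...
sampled = random.choice(['x_7'])
# ... and building a set from a string iterates it character by character,
# which is where the {'7', 'x', '_'} in the exception comes from:
print(set(sampled))        # {'x', '_', '7'} (set order varies)

# With [['x_7']] the sampled value is the *list* ['x_7'], as the model expects:
sampled = random.choice([['x_7']])
print(set(sampled))        # {'x_7'}
```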
About the 700 exogenous variables, the problem is memory, so it's really up to the hardware, but I'd say it's probably going to run out of memory.
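A rough back-of-envelope for the memory side; every number below is an illustrative assumption, not taken from the thread:

```python
# Memory for one batch of sliding windows at float32 (4 bytes per element).
# All sizes here are made-up assumptions for illustration:
batch_size = 256
input_size = 96               # window length
n_features = 1 + 700          # target plus 700 historical exogenous
bytes_per_elem = 4

batch_bytes = batch_size * input_size * n_features * bytes_per_elem
print(f"{batch_bytes / 1e6:.0f} MB per batch")  # ~69 MB, before activations/gradients
```

The input batch alone is not huge, but activations and gradients scale with it, and Ray may run several trials concurrently on the same GPU.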
u
Thanks for the quick reply! After changing to the list as you showed above, a new exception occurs:
(_train_tune pid=2432)   File "/usr/local/lib/python3.10/dist-packages/neuralforecast/common/_base_windows.py", line 336, in _get_temporal_data_cols
(_train_tune pid=2432)     set(temporal_cols.tolist()) & set(self.hist_exog_list + self.futr_exog_list)
(_train_tune pid=2432) TypeError: can only concatenate tuple (not "list") to tuple
The values I used are plain integers.
The way I fixed this is by setting futr_exog_list: tune.choice([[]]) in the config dictionary... Could you please explain why this is happening (does it require both the hist and futr settings?), and whether fixing it this way would affect the model training or not. Thanks!
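If Ray really hands the sampled value back as a tuple, the exception follows directly from Python's concatenation rules. A minimal sketch of what seems to happen in _get_temporal_data_cols (the default values here are assumptions, not read from the library source):

```python
# hist_exog_list came back from Ray Tune as a tuple; futr_exog_list was left
# at what we assume is its default, an empty *list*, so the library ends up
# doing tuple + list:
hist_exog_list = ('x_7',)
futr_exog_list = []
try:
    hist_exog_list + futr_exog_list
except TypeError as e:
    print(e)  # can only concatenate tuple (not "list") to tuple

# Once futr_exog_list is also set through the config, Ray returns it as a
# tuple too, and tuple + tuple works -- which is why the workaround helps:
futr_exog_list = ()
print(hist_exog_list + futr_exog_list)  # ('x_7',)
```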
j
Did you provide it like I suggested? It seems like the hist_exog_list is a tuple instead of a list.
l
@王梦石, would you not want to do feature selection first before feeding into the nfc models, since you have 700 exogenous features?
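One simple way to do that pre-filtering is a correlation filter: keep only the k exogenous columns most correlated with the target. The helper, column names, and data below are all illustrative, not part of neuralforecast:

```python
import numpy as np
import pandas as pd

def top_k_exog(df, target_col, exog_cols, k):
    """Keep the k exogenous columns most correlated (in absolute value)
    with the target. A deliberately simple filter for illustration."""
    corr = df[exog_cols].corrwith(df[target_col]).abs()
    return corr.sort_values(ascending=False).head(k).index.tolist()

# Toy frame: a target y, some noise columns, and one column that tracks y.
rng = np.random.default_rng(0)
df = pd.DataFrame({"y": rng.normal(size=200)})
for i in range(5):
    df[f"x_{i}"] = rng.normal(size=200)
df["x_good"] = df["y"] * 0.9 + rng.normal(scale=0.1, size=200)

exog = [c for c in df.columns if c != "y"]
print(top_k_exog(df, "y", exog, k=2))  # 'x_good' should rank first
```

The surviving names can then go into hist_exog_list instead of all 700 columns.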
u
@José Morales, hi Jose, I did exactly as you suggested. If I don't specify 'futr_exog_list' as an empty list, the error is still there. It's kind of weird. I have also checked the Ray code: it converts all the lists into tuples for the hyperparameter options to tune.
j
I think this is what it does internally, so if you specify it as a list, it should return it as a list.