lobbie lobbie
08/28/2023, 11:13 PM
Cristian (Nixtla)
08/29/2023, 2:19 PM
1. The horizon h is set directly in the Auto model, for example:
model = AutoNHITS(h=8,
                  loss=MAE(),
                  config=config,
                  search_alg=HyperOptSearch(),
                  num_samples=20)
The reason for this is that models with different h are not directly comparable.
2. No, the frequency should always be the sampling frequency of your data, in this case W-MON. To forecast 8 weeks ahead, set the horizon h to 8.
3. The default scaler_type varies depending on the base model. If you do not specify scaler_type in the config dictionary, it will use the default of the base model (the same behavior applies to all parameters). You can turn off scaling by setting scaler_type=None.
4. Yes, all numeric features that change in time should be either a hist or futr variable. BOTH sets of features should be present in the historic data, both during training and when calling the predict method. In the futr_df of the predict method, add only the futr variables (see the sketch below). We just fixed a bug regarding exogenous variables, so I recommend updating your code with the latest changes on the main branch.
5. Yes, binary and numeric variables are treated equally by the models. Adding more variables increases the input_size, increasing training times. You can start by adding the most informative variables (if you know them), and compare the performance and training times as you add more variables.
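A minimal sketch of the workflow in point 4, using the base NHITS model for brevity. The toy series and the column names sales_lag (hist exogenous) and promo (futr exogenous) are made up for illustration, not taken from this thread:

import numpy as np
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.losses.pytorch import MAE

# Training data: target y plus BOTH exogenous columns in the same frame.
dates = pd.date_range("2022-01-03", periods=60, freq="W-MON")
historic_df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": dates,
    "y": np.random.rand(60),
    "sales_lag": np.random.rand(60),        # hist exogenous: only known in the past
    "promo": np.random.randint(0, 2, 60),   # futr exogenous: also known for future dates
})

model = NHITS(h=8, input_size=16,
              hist_exog_list=["sales_lag"],
              futr_exog_list=["promo"],
              loss=MAE(), max_steps=100)
nf = NeuralForecast(models=[model], freq="W-MON")
nf.fit(df=historic_df)

# futr_df covers the next h=8 weeks and carries ONLY the futr variables.
futr_df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": pd.date_range(dates[-1] + pd.Timedelta(weeks=1), periods=8, freq="W-MON"),
    "promo": np.random.randint(0, 2, 8),
})
forecasts = nf.predict(futr_df=futr_df)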
lobbie lobbie
08/30/2023, 7:39 AM
TypeError: __init__() got an unexpected keyword argument 'stat_exog_list'
My first attempt at writing the code is:
horizon = 8
models=[
AutoRNN(h = horizon
, config = None
, stat_exog_list = mystat_exog_list # <- Static exogenous variables
, hist_exog_list = myhist_exog_list # <- Historical exogenous variables
, futr_exog_list = myfutr_exog_list # <- Future exogenous variables
, loss = MQLoss()
, num_samples = 2
#, search_alg=HyperOptSearch()
, cpus=4
, scaler_type = 'robust')
, AutoTFT(h = horizon
, config = None
, stat_exog_list = mystat_exog_list # <- Static exogenous variables
, hist_exog_list = myhist_exog_list # <- Historical exogenous variables
, futr_exog_list = myfutr_exog_list # <- Future exogenous variables
, loss = MQLoss()
, num_samples = 2
#, search_alg=HyperOptSearch()
, cpus=4
, scaler_type = 'robust')
]
nf = NeuralForecast(models = models, freq = 'W-MON')
# fit the model
nf.fit(df = historic_df, static_df=static_df)
Any ideas where I am going wrong?
Cristian (Nixtla)
08/30/2023, 5:19 PM
lobbie lobbie
08/30/2023, 10:26 PM
Cristian (Nixtla)
08/30/2023, 10:29 PM
The exogenous lists and other model hyperparameters should be passed in the config, not directly in the Auto class. Here is the documentation: https://nixtla.github.io/neuralforecast/examples/automatic_hyperparameter_tuning.html. The only arguments set directly in the Auto class are h, config, loss, search_alg, and num_samples.
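A sketch of how the earlier snippet might be rewritten following this advice, with the exogenous lists moved into the config. The search-space values shown are illustrative rather than the library defaults, and mystat_exog_list, myhist_exog_list, myfutr_exog_list, historic_df, and static_df are the objects from the earlier message:

from ray import tune
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoRNN
from neuralforecast.losses.pytorch import MQLoss

horizon = 8

# Exogenous lists (and any other base-model hyperparameter) go in the config dict.
# Wrapping a fixed list in tune.choice([...]) keeps it as a single, non-tuned option.
rnn_config = {
    "input_size": tune.choice([2 * horizon, 4 * horizon]),
    "encoder_hidden_size": tune.choice([64, 128]),
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "max_steps": 500,
    "scaler_type": "robust",
    "stat_exog_list": tune.choice([mystat_exog_list]),
    "hist_exog_list": tune.choice([myhist_exog_list]),
    "futr_exog_list": tune.choice([myfutr_exog_list]),
}

models = [AutoRNN(h=horizon,
                  config=rnn_config,
                  loss=MQLoss(),
                  num_samples=2)]   # AutoTFT follows the same pattern with its own config

nf = NeuralForecast(models=models, freq="W-MON")
nf.fit(df=historic_df, static_df=static_df)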
lobbie lobbie
08/30/2023, 10:30 PM
"random_seed": tune.randint(1, 10),
how do I set a seed for reproducible research? It seems that the global seed is always different every time I run the model. Thanks.
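A hedged sketch of one way this could be approached with the Ray Tune backend: give random_seed a fixed value in the config instead of sampling it, and seed the search algorithm as well. The random_state_seed argument of HyperOptSearch is an assumption based on Ray's API, not something confirmed in this thread:

from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch
from neuralforecast.auto import AutoNHITS
from neuralforecast.losses.pytorch import MAE

config = {
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "random_seed": 1,   # fixed value instead of tune.randint(1, 10)
}

model = AutoNHITS(h=8,
                  loss=MAE(),
                  config=config,
                  search_alg=HyperOptSearch(random_state_seed=42),  # seeds the sampler
                  num_samples=20)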