Simon Lin
04/12/2024, 5:04 PM
config_nhits = {
"input_size": tune.choice([30, 30*2, 30*3]), # Length of input window
"start_padding_enabled": True,
"n_blocks": 5*[1], # Number of blocks per stack
"mlp_units": 5 * [[64, 64]], # MLP hidden-layer sizes per stack
"n_pool_kernel_size": tune.choice([5*[1], 5*[2], 5*[4],
[8, 4, 2, 1, 1]]), # MaxPooling Kernel size
"n_freq_downsample": tune.choice([[8, 4, 2, 1, 1],
[1, 1, 1, 1, 1]]), # Interpolation expressivity ratios
"learning_rate": tune.loguniform(1e-4, 1e-2), # Initial Learning rate
"scaler_type": tune.choice(['robust']), # Scaler type
"max_steps": tune.choice([1000,1500]), # Max number of training iterations
"batch_size": tune.choice([1, 4, 10]), # Number of series in batch
"windows_batch_size": tune.choice([128, 256, 512]), # Number of windows in batch
"random_seed": tune.randint(1, 20), # Random seed
"stat_exog_list ": tune.choice([["L","AAM","ARC"],None]),
}
config_lstm = {
"input_size": tune.choice([30, 30*2, 30*3]), # Length of input window
"encoder_hidden_size": tune.choice([64, 128]), # Hidden size of LSTM cells
"encoder_n_layers": tune.choice([2,4]), # Number of layers in LSTM
"learning_rate": tune.loguniform(1e-4, 1e-2), # Initial Learning rate
"scaler_type": tune.choice(['robust']), # Scaler type
"max_steps": tune.choice([800, 500]), # Max number of training iterations
"batch_size": tune.choice([1, 4]), # Number of series in batch
"random_seed": tune.randint(1, 20), # Random seed
"stat_exog_list ": tune.choice([["L","AAM","ARC"],None]),
}
HORIZON = 30
nf = NeuralForecast(
models=[
AutoNHITS(h=HORIZON, config=config_nhits, loss=MQLoss(), num_samples=25),
AutoLSTM(h=HORIZON, config=config_lstm, loss=MQLoss(), num_samples=20),
],
freq='D'
)
nf.fit(df=train_set, static_df=static_df, val_size=30)
Then I get the error about unexpected keyword:
TypeError: Trainer.__init__() got an unexpected keyword argument 'stat_exog_list '
Marco
04/12/2024, 5:15 PM
The config key has a trailing space — it should be stat_exog_list, not "stat_exog_list ".
Otherwise, it should work 🙂
Simon Lin
04/12/2024, 5:19 PM
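The TypeError arises because the key "stat_exog_list " (with a trailing space) is not a parameter the model recognizes, so it falls through to the underlying PyTorch Lightning Trainer as an unknown keyword. A minimal, library-free sketch of the problem and a defensive fix (the dict below is a stand-in, not the full config):

```python
# Buggy config: note the trailing space inside the key string.
config = {
    "stat_exog_list ": [["L", "AAM", "ARC"], None],  # unrecognized key -> passed to Trainer
    "max_steps": 1000,
}

# Stripping whitespace from every key restores the intended parameter name.
config = {k.strip(): v for k, v in config.items()}

print("stat_exog_list" in config)   # → True
print("stat_exog_list " in config)  # → False
```

Simply deleting the stray space in the original config dict is the direct fix; the comprehension above is just a guard against the same typo recurring.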