# general
i
Hi everyone! I am trying this, but higher batch sizes are still being used by the models:

```python
nhits_default_config = AutoNHITS.get_default_config(h=horizon, backend="optuna")

def config_nhits(trial):
    config = {**nhits_default_config(trial)}
    config.update({
        "early_stop_patience_steps": trial.suggest_categorical(
            "early_stop_patience_steps", [3, 3]),
        "windows_batch_size": 32,
    })
    return config
```
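A minimal sketch of the override pattern in the message above, runnable without neuralforecast or optuna installed. `default_config` is a hypothetical stand-in for the library's `get_default_config(...)` result, and `StubTrial` mimics only the one piece of Optuna's `Trial` API used here:

```python
class StubTrial:
    """Minimal stand-in for optuna.trial.Trial."""
    def suggest_categorical(self, name, choices):
        # Optuna would sample from `choices`; the stub just takes the first.
        return choices[0]

def default_config(trial):
    # Hypothetical placeholder for the library-provided default search space.
    return {"learning_rate": 1e-3, "windows_batch_size": 256}

def config_nhits(trial):
    # Copy the defaults, then overwrite the keys we want to pin or re-tune.
    config = {**default_config(trial)}
    config.update({
        "early_stop_patience_steps": trial.suggest_categorical(
            "early_stop_patience_steps", [3]),
        "windows_batch_size": 32,  # fixed override of the default 256
    })
    return config

cfg = config_nhits(StubTrial())
# cfg["windows_batch_size"] is now 32; untouched defaults pass through.
```

The key point is that `config.update(...)` only replaces the keys you name, so everything else in the default search space is still tuned as usual.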
m
In your config update, can you try setting

```python
"batch_size": 32,
```
i
@Marco Do you want me to pass batch_size instead of windows_batch_size, or both?
@Marco Using this:

```python
def config_nbeats(trial):
    config = {**nbeats_default_config(trial)}
    config.update({
        "early_stop_patience_steps": trial.suggest_categorical(
            "early_stop_patience_steps", [3, 3]),
        "batch_size": 32,
    })
    return config
```

I am getting:
```
Trial 1 failed with parameters: {'n_pool_kernel_size': [8, 4, 1], 'n_freq_downsample': [24, 12, 1], 'learning_rate': 0.00023747165956426155, 'scaler_type': None, 'max_steps': 1000.0, 'batch_size': 128, 'windows_batch_size': 256, 'random_seed': 11, 'input_size': 60, 'step_size': 1, 'early_stop_patience_steps': 3} because of the following error: Exception('No windows available for training').
Traceback (most recent call last):
```
It's resolved by passing input_size instead of batch_size. Thanks