Steffen
04/19/2024, 8:50 AM
Is there a way to pass lr_schedulers to the models? Possibly through the trainer_kwargs argument? E.g., torch.optim.lr_scheduler.OneCycleLR to be used within an NBEATSx model. I gave it some attempts without success.

Jing Qiang Goh
04/19/2024, 10:00 AM

Steffen
04/23/2024, 11:17 AM
trainer_kwargs enables the passing of PyTorch Lightning Trainer arguments, and since various lr schedulers are implemented in PyTorch Lightning, might it be possible to pass lr_schedulers through this trainer_kwargs attribute of the models? 💡🤔

Jing Qiang Goh
04/23/2024, 2:11 PM
trainer_kwargs refers to the arguments of the PyTorch Lightning Trainer, https://lightning.ai/docs/pytorch/stable/api/pytorch_lightning.trainer.trainer.Trainer.html, which include accelerator and enable_checkpointing. However, lr_scheduler behavior needs to be customized via configure_optimizers, as shown in https://lightning.ai/docs/pytorch/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#lightning.pytorch.trainer.[…].estimated_stepping_batches, which requires a different treatment.
cc: @José Morales
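
(For reference, a minimal sketch of the Lightning pattern being referred to: overriding configure_optimizers on a LightningModule and using self.trainer.estimated_stepping_batches to size a OneCycleLR schedule. This is a generic example, not NeuralForecast's internal code; the module and layer names are made up.)
```
import torch
import pytorch_lightning as pl


class LitRegressor(pl.LightningModule):
    """Toy LightningModule; only configure_optimizers matters here."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.OneCycleLR(
            optimizer,
            max_lr=1e-2,
            # estimated_stepping_batches is the total number of optimizer
            # steps the Trainer will run, which OneCycleLR needs as total_steps
            total_steps=self.trainer.estimated_stepping_batches,
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }
```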
Jing Qiang Goh
05/25/2024, 6:17 AM
You can use the lr_scheduler and lr_scheduler_kwargs arguments to use your own lr_scheduler. This support should cover most models, including NBEATSx.
If you want to have full control of the configure_optimizers behavior, you can check this work: https://github.com/Nixtla/neuralforecast/pull/1015 (note: this is not included in the NeuralForecast library).
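
(A minimal sketch of how this could look, assuming a neuralforecast version where the models accept lr_scheduler / lr_scheduler_kwargs; the hyperparameter values are illustrative only.)
```
import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATSx
from neuralforecast.utils import AirPassengersDF

model = NBEATSx(
    h=12,
    input_size=24,
    max_steps=500,
    # pass the scheduler class (not an instance); the model is expected to
    # instantiate it with its optimizer plus these keyword arguments
    lr_scheduler=torch.optim.lr_scheduler.OneCycleLR,
    # total_steps set to match max_steps here (an assumption: one optimizer
    # step per training step)
    lr_scheduler_kwargs={"max_lr": 1e-3, "total_steps": 500},
)

nf = NeuralForecast(models=[model], freq="M")
nf.fit(df=AirPassengersDF)
forecasts = nf.predict()
```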