# neural-forecast
s
Hi community, is there a way to pass `lr_scheduler`s to the models? Possibly through the `trainer_kwargs` argument? E.g., `torch.optim.lr_scheduler.OneCycleLR` to be used within an NBEATSx model. I gave it some attempts without success.
j
I think the package only supports passing the choice of optimizer for now (https://github.com/Nixtla/neuralforecast/pull/901), but not the scheduler for the optimizer. If the Nixtla team thinks this is a good feature to add, we could track this request in an issue.
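For reference, the optimizer support from that PR looks roughly like this (a minimal sketch; the NBEATSx hyperparameters below are placeholders, not recommendations):

```python
import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATSx

# Sketch of the optimizer support: pass the optimizer class and its keyword
# arguments to the model constructor (hyperparameter values are placeholders).
model = NBEATSx(
    h=12,
    input_size=24,
    max_steps=100,
    optimizer=torch.optim.AdamW,
    optimizer_kwargs={"lr": 1e-3, "weight_decay": 1e-2},
)
nf = NeuralForecast(models=[model], freq="M")
# nf.fit(df=...)  # train on your dataframe as usual
```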
s
Interesting. I was thinking that since `trainer_kwargs` lets you pass PyTorch Lightning trainer arguments, and since various lr schedulers are implemented in PyTorch Lightning, it might be possible to pass `lr_scheduler`s through this (`trainer_kwargs`) attribute of the models? 💡🤔
j
@Steffen I think `trainer_kwargs` refers to the arguments of the PyTorch Lightning `Trainer` (https://lightning.ai/docs/pytorch/stable/api/pytorch_lightning.trainer.trainer.Trainer.html), which include `accelerator`, `enable_checkpointing`, etc. However, the `lr_scheduler` behavior needs to be customized via `configure_optimizers`, as shown in https://lightning.ai/docs/pytorch/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#lightning.pytorch.trainer.[…].estimated_stepping_batches, which requires a different treatment. cc: @José Morales
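For context, the pattern from that doc looks roughly like this in plain PyTorch Lightning (a minimal sketch with a placeholder module, not a NeuralForecast model):

```python
import torch
import pytorch_lightning as pl


class TinyModel(pl.LightningModule):
    """Placeholder LightningModule showing the configure_optimizers hook."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        # estimated_stepping_batches gives the total number of optimizer steps,
        # which OneCycleLR needs up front
        scheduler = torch.optim.lr_scheduler.OneCycleLR(
            optimizer,
            max_lr=1e-2,
            total_steps=self.trainer.estimated_stepping_batches,
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }
```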
@Steffen https://github.com/Nixtla/neuralforecast/pull/998 has been merged, so you can install the library from source and pass `lr_scheduler` and `lr_scheduler_kwargs` to use your own lr scheduler. This support should cover most models, including NBEATSx. If you want full control over the `configure_optimizers` behavior, you can check this work: https://github.com/Nixtla/neuralforecast/pull/1015 (note: this is not included in the NeuralForecast library)
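Something along these lines should then work (a sketch assuming the interface from that PR; the hyperparameters are placeholders, and StepLR is used only because it does not need the total step count that OneCycleLR requires):

```python
import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATSx

# Sketch: pass the scheduler class and its keyword arguments alongside the
# usual model hyperparameters (all values here are placeholders).
model = NBEATSx(
    h=12,
    input_size=24,
    max_steps=200,
    lr_scheduler=torch.optim.lr_scheduler.StepLR,
    lr_scheduler_kwargs={"step_size": 50, "gamma": 0.5},
)
nf = NeuralForecast(models=[model], freq="M")
# nf.fit(df=...)  # train as usual; the scheduler is attached internally
```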