# neural-forecast
Dawie van Lill:
Hi there! We want to compare forecasting methods, including some that are not implemented in neuralforecast, and we want to treat them on a similar basis. One of the key aspects is to ensure that the hyperparameter tuning is done in a consistent manner. For the methods in neuralforecast we are using the automatic tuning functionality to find good hyperparameters, and we would like to apply the same or a similar methodology to the other methods. Could you provide some information or a link on the hyperparameter tuning policy? For example, how are the training/validation/test sets selected and processed? Or do you tune based on backtesting results?
Kin Gtz. Olivares:
Hey @Dawie van Lill, thanks for using NeuralForecast. Here is the hyperparameter tuning documentation where we explain it: https://nixtla.github.io/neuralforecast/common.base_auto.html

Some ideas for comparing other methods on the same footing:
1. If the models are not PyTorch Lightning classes, you will need to write your own hyperparameter optimization functions that homogenize the models' inputs, outputs, and evaluation, and then use a library like hyperopt directly (a sketch of this is shown just below).
2. If the models are PyTorch Lightning classes, you might be able to feed them into `BaseAuto`, and you could become an important contributor to NeuralForecast's model collection.

What are the models that you are looking to compare?
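A minimal sketch of option 1, not NeuralForecast's own `BaseAuto` logic: tuning an sklearn SVR with hyperopt on a time-ordered train/validation split, so the objective mirrors a validation-loss-based tuning policy. The file name, column names, horizon, and search space are illustrative assumptions.

```python
# Hypothetical example: hyperopt tuning of an sklearn SVR on a held-out
# validation window. This is NOT NeuralForecast's BaseAuto, just a sketch
# of writing your own tuning loop for a non-Lightning model.
import numpy as np
import pandas as pd
from hyperopt import fmin, tpe, hp, Trials
from sklearn.svm import SVR

df = pd.read_csv("series_with_features.csv")           # assumed data file
horizon = 24                                            # assumed validation window
train, valid = df.iloc[:-horizon], df.iloc[-horizon:]   # last window held out
feature_cols = [c for c in df.columns if c not in ("unique_id", "ds", "y")]

def objective(params):
    """Fit on the training window and return validation MAE as the loss."""
    model = SVR(C=params["C"], epsilon=params["epsilon"])
    model.fit(train[feature_cols], train["y"])
    pred = model.predict(valid[feature_cols])
    return float(np.mean(np.abs(valid["y"].to_numpy() - pred)))

# Assumed search space; adjust the ranges to your problem.
space = {
    "C": hp.loguniform("C", np.log(1e-2), np.log(1e2)),
    "epsilon": hp.loguniform("epsilon", np.log(1e-3), np.log(1.0)),
}

best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=Trials())
print(best)
```

The same wrapper pattern (fixed split, shared loss, shared search budget) can be reused for RF or any other sklearn-style model so that all methods are tuned under the same protocol.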
Dawie van Lill:
The other models are ML-based (SVR, RF, etc.). We would have liked to use mlforecast, but as far as I can tell it doesn't currently support the inclusion of historical and future exogenous variables.
Kin Gtz. Olivares:
One of the main advantages of NeuralForecast over MLForecast and alternatives is the automatic featurization, which avoids the hassle of feature engineering. But in principle you should be able to feed the exogenous variables as in a classic XGB regression to model this problem:
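A hedged sketch of that "classic XGB regression" idea: lag features built from the target plus exogenous columns in one design matrix. The data file and the `price`/`holiday` columns are illustrative assumptions, not part of this thread.

```python
# Hypothetical single-series example: exogenous columns fed alongside lag
# features to a gradient-boosted regressor.
import pandas as pd
from xgboost import XGBRegressor

df = pd.read_csv("sales.csv", parse_dates=["ds"])  # assumed columns: ds, y, price, holiday

# Build lag features from the target; exogenous columns come along as-is.
for lag in (1, 7, 14):
    df[f"y_lag{lag}"] = df["y"].shift(lag)
df = df.dropna()

feature_cols = ["price", "holiday", "y_lag1", "y_lag7", "y_lag14"]
model = XGBRegressor(n_estimators=300, max_depth=4)
model.fit(df[feature_cols], df["y"])

# At prediction time the future rows must already carry the exogenous values
# (price, holiday) for each forecast creation date, as noted below.
```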
As long as the rows for each forecast creation date match your features, I think you should be able to use MLForecast.
On another note, the distinction between historical and future exogenous variables is a fairly recent innovation that very few forecasting models make, which is another NeuralForecast advantage.
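To make that distinction concrete, a minimal NeuralForecast sketch; the NHITS settings, frequency, file names, and exogenous column names are illustrative assumptions.

```python
# Hypothetical example of declaring historical vs. future exogenous variables.
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Assumed long-format training data with columns: unique_id, ds, y,
# plus the exogenous columns referenced below.
Y_train_df = pd.read_csv("train.csv", parse_dates=["ds"])
# Assumed future frame with unique_id, ds, and the future exogenous columns only.
futr_df = pd.read_csv("future_exog.csv", parse_dates=["ds"])

model = NHITS(
    h=24,
    input_size=48,
    hist_exog_list=["web_traffic"],       # observed only up to the forecast date
    futr_exog_list=["price", "holiday"],  # known into the future
    max_steps=500,
)
nf = NeuralForecast(models=[model], freq="H")
nf.fit(df=Y_train_df)
forecasts = nf.predict(futr_df=futr_df)
```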
Tyler Blume:
@Kin Gtz. Olivares do you know if there are any plans to add more 'featurization' to MLForecast? There are some pretty basic things that could extend it and give good performance boosts.
Kin Gtz. Olivares:
Hey @Tyler Blume, if you have some ideas or feature requests, add them here: https://github.com/Nixtla/mlforecast/issues. We are happy to include them.