# neural-forecast
k
Hey @Dawie van Lill, thanks for using NeuralForecast. Here is the hyperparameter tuning documentation where we explain it: https://nixtla.github.io/neuralforecast/common.base_auto.html
Some ideas for comparing other methods:
1. If the models are not PyTorch Lightning classes, you will need to write your own hyperparameter optimization functions that homogenize the models' inputs, outputs, and evaluation, and then use a library like hyperopt directly.
2. If the models are PyTorch Lightning classes, you might be able to feed them into `BaseAuto`, and you could become an important contributor to NeuralForecast's model collection.
Which models are you looking to compare?
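To make idea 1 concrete, here is a minimal sketch of a model-agnostic search loop that homogenizes `.fit`/`.predict` and evaluation behind one interface. All names here (`grid_search`, `ConstantModel`) are illustrative, not part of any Nixtla API; in practice you would replace the exhaustive loop with hyperopt's `fmin` over a search space.

```python
from itertools import product

def grid_search(make_model, param_grid, X_train, y_train, X_val, y_val):
    """Model-agnostic hyperparameter search: any object exposing
    .fit(X, y) and .predict(X) can be tuned, which homogenizes
    heterogeneous models (SVR, RF, ...) behind one interface.
    param_grid maps name -> list of candidates. Returns (best_params, best_mae)."""
    best_params, best_mae = None, float("inf")
    names = list(param_grid)
    for combo in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, combo))
        model = make_model(params)
        model.fit(X_train, y_train)
        preds = model.predict(X_val)
        # Homogenized evaluation: mean absolute error on the validation set.
        mae = sum(abs(p - t) for p, t in zip(preds, y_val)) / len(y_val)
        if mae < best_mae:
            best_params, best_mae = params, mae
    return best_params, best_mae

# Toy stand-in model: predicts a constant set by its hyperparameter.
class ConstantModel:
    def __init__(self, c):
        self.c = c
    def fit(self, X, y):
        return self
    def predict(self, X):
        return [self.c] * len(X)

best, mae = grid_search(
    make_model=lambda p: ConstantModel(p["c"]),
    param_grid={"c": [0.0, 1.0, 2.0]},
    X_train=[[0]], y_train=[1.0],
    X_val=[[0], [1]], y_val=[1.0, 1.0],
)
print(best, mae)  # → {'c': 1.0} 0.0
```

Swapping the grid loop for hyperopt's TPE sampler only changes how `params` is drawn; the homogenized fit/predict/evaluate core stays the same.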
🔥
d
The other models are ML-based (SVR, RF, etc.). We would have liked to use mlforecast, but as far as I can tell it doesn't currently support including both historical and future exogenous variables.
k
One of the main advantages of NeuralForecast over MLForecast and alternatives is automatic featurization, which avoids the hassle of manual feature engineering. But in principle you should be able to feed the exogenous variables as in a classic XGBoost regression to model this problem:
As long as the rows of your features align with the forecast creation dates, I think you should be able to use MLForecast.
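To illustrate the alignment idea above, here is a minimal sketch that builds a flat design matrix of the kind you would feed to an XGBoost-style regressor: each row pairs lagged target values (historic features) with a future exogenous value that is known at forecast creation time. The function and column layout are illustrative, not MLForecast's API.

```python
def make_design_matrix(y, future_exog, n_lags, horizon=1):
    """y: list of target values; future_exog: list aligned with y whose
    values are known in advance (e.g. a planned promotion flag).
    Row t uses lags y[t-n_lags:t] plus future_exog[t+horizon-1]
    to predict y[t+horizon-1]. Returns (X, targets)."""
    X, targets = [], []
    for t in range(n_lags, len(y) - horizon + 1):
        lags = y[t - n_lags:t]               # historic features
        exog = future_exog[t + horizon - 1]  # future exogenous, known ahead
        X.append(lags + [exog])
        targets.append(y[t + horizon - 1])
    return X, targets

y = [10, 12, 11, 13, 12, 14]
promo = [0, 0, 1, 0, 1, 0]  # hypothetical promotion flag, known in advance
X, targets = make_design_matrix(y, promo, n_lags=2)
print(X)        # → [[10, 12, 1], [12, 11, 0], [11, 13, 1], [13, 12, 0]]
print(targets)  # → [11, 13, 12, 14]
```

Any tabular regressor trained on `(X, targets)` then uses both the history and the known future value, which is the classic-regression framing of exogenous variables described above.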
On another note, the distinction between historic and future exogenous variables is a fairly new innovation that very few forecasting models make, which is another NeuralForecast advantage.
t
@Kin Gtz. Olivares do you know if there are any plans to add more 'featurization' to MLForecast? There are some pretty basic additions that could extend it and give good performance boosts.
k
Hey @Tyler Blume, if you have ideas or feature requests, add them here: https://github.com/Nixtla/mlforecast/issues. We'd be happy to include them.