# neural-forecast
Kin Gtz. Olivares
Hey @Manuel, you have an interesting challenge. Here are some ideas:

1. One thing we can try is static variable encoders. In its current version, NBEATSx uses the static exogenous variables without encoding them first: https://github.com/Nixtla/neuralforecast/blob/main/neuralforecast/models/nbeatsx.py#L224
2. The [TFT architecture](https://nixtla.github.io/neuralforecast/examples/forecasting_tft.html) already has them implemented: https://github.com/Nixtla/neuralforecast/blob/main/neuralforecast/models/tft.py#L141

The downside of the TFT architecture is its speed, so I recommend using a Google Colab GPU for your experiments. If using the encoded static exogenous variables improves your predictions, please let me know so that we can improve the NBEATSx encoders.
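To illustrate the difference being discussed, here is a minimal numpy sketch of what a static variable encoder does: instead of feeding raw static exogenous features to the network, project them through a small learned layer first. This is the general idea behind TFT's static covariate encoders, not the actual neuralforecast code; all names, shapes, and the single ReLU layer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the library):
# 4 series, 3 static exogenous features each, 8-dim hidden encoding.
n_series, n_static, hidden = 4, 3, 8

static_feats = rng.normal(size=(n_series, n_static))  # raw static exogenous

# Learnable parameters of the encoder (randomly initialized here; in a real
# model these would be trained jointly with the forecasting network).
W = rng.normal(size=(n_static, hidden)) * 0.1
b = np.zeros(hidden)

def encode_static(x):
    """Project raw static features to a hidden representation (one ReLU layer)."""
    return np.maximum(x @ W + b, 0.0)

# The encoded context is what gets concatenated with temporal inputs,
# rather than the raw static columns.
static_context = encode_static(static_feats)
print(static_context.shape)  # (4, 8)
```

The point of the sketch: NBEATSx (at the linked line) concatenates the raw static values directly, while TFT first passes them through encoders like this, which can matter when the static features interact nonlinearly with the seasonality.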
Manuel
@Kin Gtz. Olivares Thanks! If I use the TFT model and set a high number of training steps (30-40 epochs), the model is somehow able to predict the winter peak that the global NBEATSx model could not predict, even if the predicted values are underestimated. Interestingly, if I train NBEATSx for the same large number of epochs, it is still unable to predict the peak. I do not know whether the difference is due to the different handling of exogenous variables or to the model architecture itself.
Kin Gtz. Olivares
It is in the architecture: NBEATSx does not have a static encoder, while TFT does.
You can help both NBEATSx and TFT by feeding them a SeasonalNaive “anchor” prediction that captures the winter peaks, if you have enough history for your series.
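A pandas sketch of the SeasonalNaive anchor idea (column names, the toy series, and the monthly/yearly seasonality are all illustrative assumptions): shift each series by one full seasonal period and feed the shifted values as an extra historical exogenous regressor, so the model sees last winter's peak when predicting this winter.

```python
import numpy as np
import pandas as pd

# Assumption for illustration: monthly data with yearly seasonality.
season_length = 12

# Toy long-format frame with a winter (December) peak in each year.
dates = pd.date_range("2018-01-01", periods=36, freq="MS")
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": dates,
    "y": 10
         + 5 * (dates.month == 12).astype(float)  # December spike
         + np.random.default_rng(1).normal(0, 0.1, 36),
})

# SeasonalNaive anchor: the observed value one full season earlier, per series.
df["seasonal_naive_anchor"] = df.groupby("unique_id")["y"].shift(season_length)

# The first season has no anchor; drop (or impute) those rows before training.
df = df.dropna(subset=["seasonal_naive_anchor"])
print(df[["ds", "y", "seasonal_naive_anchor"]].tail(3))
```

The resulting `seasonal_naive_anchor` column can then be passed to the model as a historical exogenous variable, giving both NBEATSx and TFT an explicit hint about the recurring peak instead of asking the network to rediscover it.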