# neural-forecast
Hi @Manuel Chabier Escolá Pérez! Can you remove the line with `%%capture` so that it prints where it is running out of memory? The problem is more likely that the horizon is large (192) and the `input_size` is 5 times that, plus all 47 exogenous variables for the same window size. Some suggestions (see the sketch below this list):
• Reduce the `input_size` (1, 2, or 3 times the horizon).
• Reduce the `batch_size`, `windows_batch_size`, `valid_batch_size`, and `inference_windows_batch_size`.
• Make your dataset smaller, by either removing the less informative exogenous variables or using only the latest data.
The GPUs in Google Colab are usually very small, with very limited RAM, so you will be limited in the model's or data's size. The ideal case is to use a larger GPU; we use AWS EC2 instances. Iterate: keep reducing the hyperparameters above and the size of the data until it fits in your GPU's memory.