# neural-forecast
b
Hi, I have a question regarding training times. NeuralForecast is supposed to train global models, but even after simplifying the setup significantly, using only `num_samples=2` and selecting just 4 of the 176 unique IDs, training still takes a very long time (about 30 minutes). This becomes even more problematic with the complete dataset. In contrast, when using MLForecast, training is significantly faster, taking only a few seconds. Could you please clarify why this happens and what I could do to mitigate it?
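Roughly, the simplified setup looks like this (a minimal sketch only; the file name, horizon, and daily frequency are placeholders, not my actual values):

```python
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS

# Long-format dataframe with columns unique_id, ds, y (file name is a placeholder)
Y_df = pd.read_parquet("my_series.parquet")

# Keep only 4 of the 176 unique series
subset_ids = Y_df["unique_id"].unique()[:4]
Y_small = Y_df[Y_df["unique_id"].isin(subset_ids)]

# AutoNHITS with its default search grid, but only 2 hyperparameter samples
model = AutoNHITS(h=28, num_samples=2)  # horizon of 28 is just an example

nf = NeuralForecast(models=[model], freq="D")  # frequency assumed daily
nf.fit(df=Y_small)
```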
j
Which algorithms are you using, and with which parameters? It could be that very deep networks with many layers are being trained. If you use an LSTM, for example, start with 2 or 3 layers; training should then hopefully be faster, and deeper networks would most likely be too much for your dataset anyway.
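For example, with NeuralForecast's LSTM model a small starting point could look roughly like this (a sketch only; parameter names such as `encoder_n_layers` and the example values are assumptions and may differ between versions):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import LSTM

# A deliberately small LSTM: 2 recurrent layers, modest hidden size,
# and a capped number of training steps so a single fit stays fast.
model = LSTM(
    h=28,                   # forecast horizon (example value)
    input_size=56,          # lookback window (example value)
    encoder_n_layers=2,     # start shallow, as suggested above
    encoder_hidden_size=64,
    max_steps=500,          # limit optimization steps per fit
)

nf = NeuralForecast(models=[model], freq="D")
# nf.fit(df=Y_small)  # same subset dataframe as above
```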
b
I am using AutoNHITS' default search grid
j
I don’t know the default search grid off-hand. Maybe check in the documentation how many layers, and how many configurations overall, it tests.
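Depending on the version, the search space is also exposed in code, so you can inspect it and pass a trimmed-down version yourself (a sketch; the `default_config` attribute and the exact keys shown are assumptions and may vary between releases):

```python
from ray import tune
from neuralforecast.auto import AutoNHITS

# Inspect the default hyperparameter search space (attribute name may vary by version)
print(AutoNHITS.default_config)

# Or bypass it entirely with a small custom search space
small_config = {
    "input_size": tune.choice([28, 56]),
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "max_steps": tune.choice([300]),   # cap training steps per trial
    "batch_size": tune.choice([32]),
}
model = AutoNHITS(h=28, config=small_config, num_samples=2)
```

Two samples over a small search space, each with a capped number of training steps, should bring a single AutoNHITS run much closer to the MLForecast timings.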