# neural-forecast
Cristian (Nixtla):
Hi @Valeriy! It stopped because `max_steps` is 200. PyTorch Lightning prints outputs per epoch and paints the progress bar red when training stops in the middle of an epoch.
Yes, you need to clip negative values afterwards. Alternatively, you can train with the Poisson distribution loss or the PMM, which restrict predictions to positive values.
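A minimal sketch of the clipping approach, assuming the point forecasts are already in a NumPy array (the values here are made up for illustration):

```python
import numpy as np

# Hypothetical point forecasts from NBEATS/NHITS; small or intermittent
# series can yield negative predictions.
forecasts = np.array([3.2, -0.7, 5.1, -1.4, 2.0])

# Clip negatives to zero after predicting, per the suggestion above.
clipped = np.clip(forecasts, a_min=0.0, a_max=None)
```

If you need strictly positive outputs by construction instead of post-hoc clipping, that is where the Poisson or PMM losses come in.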
Valeriy:
@Cristian (Nixtla) thank you! Is there an easy way to specify the optimal training length? Should I increase `max_steps`, and if so, would early stopping still apply?
Cristian (Nixtla):
Yes, increase `max_steps` to at least 1k. If you have several thousand series, increase it further to 5k-10k; in our experience that is more than enough for NBEATS/NHITS. To add validation, specify `val_size` in the `fit` method. You can control how often the validation loss is computed with the model's `val_check_steps` parameter.
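A configuration sketch of the settings above, assuming the `neuralforecast` package; the horizon, input size, and `val_size` values are placeholders, not recommendations:

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

models = [
    NHITS(
        h=12,                 # forecast horizon (placeholder)
        input_size=24,        # lookback window (placeholder)
        max_steps=1000,       # train longer than the 200-step default run
        val_check_steps=100,  # compute validation loss every 100 steps
    )
]
nf = NeuralForecast(models=models, freq="M")

# val_size reserves the last points of each series for validation:
# nf.fit(df, val_size=24)
```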
Valeriy:
Great, many thanks @Cristian (Nixtla)