# neural-forecast
c
Hi @Phil. A couple of questions:
• Are you using the latest code from the main branch (with today's PR), or the pip version?
• Do you have "short" time series, i.e., total length < input_size + h?
• Have you tried larger `mlp_units`? 16 units is an extremely small network.
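For context, a minimal sketch of what a wider network could look like, assuming an NHITS-style model (the exact model class isn't named in this thread, and `freq='D'` assumes daily data); the library's default width is much larger than 16 units per layer:

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS  # assumed model class, not confirmed in the thread

# mlp_units is a list of [width, width] pairs, one per stack;
# the default is 3 * [[512, 512]], so 16-unit layers are far below that.
model = NHITS(
    h=180,
    input_size=360,
    mlp_units=3 * [[512, 512]],
)
nf = NeuralForecast(models=[model], freq='D')  # freq='D' assumes daily data
```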
p
Hi Cristian,
• In these experiments I am not using the latest version; unfortunately, I am limited to the pip version for now, specifically 1.6.1.
• Each of the ten time series in the training set is 676 days long. In this case, the input size is 360 and the horizon is 180. I have a test portion of 270 days.
• I did try a larger number of `mlp_units`, but the performance decreases.
For reference, I redacted the names of the time series and rescaled the numbers so I don't get in trouble, but they generally look like this.
From my experiments, the training loss decreases most from adding blocks: the higher the number of blocks, the lower the training loss.
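As a sketch of what scaling the number of blocks might look like, again assuming an NHITS-style model; `n_blocks` and `stack_types` are library parameters, but the specific values here are illustrative:

```python
from neuralforecast.models import NHITS  # assumed model class

# More blocks per stack; the library default is one block per stack.
model = NHITS(
    h=180,
    input_size=360,
    stack_types=['identity', 'identity', 'identity'],
    n_blocks=[3, 3, 3],          # illustrative: 3 blocks in each of the 3 stacks
    mlp_units=3 * [[512, 512]],
)
```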
c
The validation set is just a classic split, right? Taken right after the training portion of these series?
p
That's right. I simply call `nf.fit(df=Y_train_df.reset_index(), val_size=180)`.
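For reference, a minimal sketch of how that call splits the data, assuming the standard long format with `unique_id`, `ds`, and `y` columns and daily frequency (`Y_train_df` is the training DataFrame from the thread):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS  # assumed model class

model = NHITS(h=180, input_size=360)
nf = NeuralForecast(models=[model], freq='D')  # freq='D' assumes daily data

# val_size=180 holds out the last 180 time steps of each training series
# for validation; the split is chronological, not random.
nf.fit(df=Y_train_df.reset_index(), val_size=180)

# Forecasts the 180 days immediately after the end of the training data,
# which can be compared against the first 180 days of the 270-day test portion.
Y_hat_df = nf.predict()
```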