# neural-forecast
NHiTS uses different blocks to reconstruct different frequencies in the signal, and it does this locally along the horizon, according to the N-HiTS paper. So you don't need to do anything special about non-stationarity or seasonality. I think other models may handle these as well, just in different ways. I think the point of using deep learning models as opposed to statistical methods is that they handle these features out of the box. But I haven't used anything besides NHiTS and N-BEATS, so don't quote me on the others.
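For illustration, a minimal sketch of fitting NHITS directly on a trending, seasonal series with no differencing or deseasonalization (my own example, not from the thread; `AirPassengersDF` ships with the library, other values are placeholders):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF  # monthly series with trend + seasonality

# The raw, non-stationary series goes straight in.
nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=100)], freq='M')
nf.fit(df=AirPassengersDF)
forecast = nf.predict()  # 12-month-ahead forecast per series
```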
great thanks!
I'm really not sure that's true? Imagine a time series that starts at 0 and keeps growing in value (stock prices?). If you don't difference the values to make them stationary, the average absolute errors near the start will be tiny and the errors near the end could be exponentially larger. I don't think that's what you want, is it? (I'm a beginner, by the way, but it makes sense to me; having stationary data seems 'right'.) Would love some more experienced input here.
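To make that concern concrete, a quick toy calculation (my own illustration):

```python
import numpy as np

# A series growing exponentially from 1 to ~141.
t = np.arange(100)
y = np.exp(0.05 * t)

# Suppose a model is off by 5% at every step: the *relative* error is
# flat, but the *absolute* error grows with the level of the series.
abs_err = 0.05 * y
print(abs_err[0])   # ~0.05 near the start
print(abs_err[-1])  # ~7.1 near the end
```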
Hi @Aman Singh. For most models and applications you don't need stationary series for deep learning methods; that condition is mostly required by some "statistical" autoregressive models (ARMA, for example). Most deep learning models have modules specifically designed to capture trends and seasonalities.
👍 1
Regarding your point @GohOnLeeds, all models in the library have two types of normalization: temporal local scalers (via the `scaler_type` parameter) and global scaling (at the `core` class). These scalers help prevent exploding/vanishing gradients and homogenize the input when time series have large scales.
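A sketch of where each scaler is configured (my own example; the `local_scaler_type` argument on the core class depends on your neuralforecast version):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Temporal local scaling: normalizes each input window inside the model.
model = NHITS(h=12, input_size=24, scaler_type='robust')

# Global scaling: normalizes each full series before training,
# configured on the core class.
nf = NeuralForecast(models=[model], freq='M', local_scaler_type='standard')
```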
Also, some training losses, like MAE and Huber loss, are robust to the scale of the data, which helps with the case of growing absolute errors. For evaluation, in most cases we scale the time series back to the original scale. For cases where absolute errors vary widely between series or over time, you can use a scale-invariant metric like MAPE, sMAPE, and others.
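For example (a minimal sketch; module paths assume a recent neuralforecast release):

```python
import numpy as np
from neuralforecast.models import NHITS
from neuralforecast.losses.pytorch import HuberLoss
from neuralforecast.losses.numpy import mape, smape

# Train with a scale-robust loss instead of MSE.
model = NHITS(h=12, input_size=24, loss=HuberLoss())

# Evaluate on the original scale with scale-invariant metrics.
y_true = np.array([100.0, 110.0, 120.0])
y_hat = np.array([98.0, 113.0, 118.0])
print(mape(y_true, y_hat), smape(y_true, y_hat))
```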
🙌 1
Hello @Cristian (Nixtla), sorry, but it's not clear to me when you say "all models in the library have two types of normalization (temporal local scalers with the `scaler_type` parameter) and global scaling (at the `core` class)". I am following §3 of the tutorial https://nixtlaverse.nixtla.io/neuralforecast/examples/time_series_scaling.html but applying it to NHITS, and `local_scalar_type` is not available!

```python
from neuralforecast.models import LSTM, NHITS, RNN
from neuralforecast.core import NeuralForecast

models = [NHITS(h=...)]
nf = NeuralForecast(models=models, freq='B', local_scalar_type='standard')
```

Why is it different from the tutorial? That is the core class, right? Please advise. Thank you
Hi! Yes, the local scaling is specified in the model and the global scaling in the core class.
You don't need the `local_` prefix in the hyperparameter; on the model it is just `scaler_type`.
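Putting that together, a corrected sketch of the snippet above (my reading of the advice; exact arguments depend on the library version):

```python
from neuralforecast.core import NeuralForecast
from neuralforecast.models import NHITS

# Local (temporal) scaling is a model hyperparameter: scaler_type,
# with no 'local_' prefix.
models = [NHITS(h=12, input_size=24, scaler_type='standard')]

# The core class needs no scaler argument for the local case.
nf = NeuralForecast(models=models, freq='B')
```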