# general
j
Hey. You should be able to do that if you want to, you probably just need to change the dates
c
NF supports both historical and lagged features but I’m not sure if a single column can be in both
c
@José Morales Thanks. But changing the dates isn't clear to me, could you elaborate? Indeed, in my data the target and the exogenous variables cover the same range of dates. Moreover, I have mathematical guarantees that lagged exogenous variables impact the forecast. My concern is how to use that in fitting and then in forecasting. Thanks
j
Which library are you using?
c
Statsforecast and MLforecast. Both require having future exogenous variables
j
In statsforecast I think the dates aren't checked, so you can just provide a dataframe with the correct shape, e.g. if you have 10 series and are forecasting 5 periods it should have 50 rows. In mlforecast the dates are checked, so you'd have to offset the dates in your future dataframe so that the first date is the one immediately after the last training date for each series.
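A minimal pandas sketch of that shape/offset rule (the series ids, horizon, column names, and placeholder exogenous values below are made up for illustration):

```python
import pandas as pd

# Hypothetical training frame: 2 daily series with one exogenous column.
train = pd.DataFrame({
    "unique_id": ["A"] * 5 + ["B"] * 5,
    "ds": list(pd.date_range("2024-01-01", periods=5)) * 2,
    "y": range(10),
    "exog": range(10),
})

h = 3  # forecast horizon

# For each series, future dates must start right after its last training date.
last = train.groupby("unique_id")["ds"].max()
future = pd.concat(
    [
        pd.DataFrame({
            "unique_id": uid,
            "ds": pd.date_range(last_date + pd.Timedelta(days=1), periods=h),
            "exog": 0.0,  # placeholder future exogenous values
        })
        for uid, last_date in last.items()
    ],
    ignore_index=True,
)
# 2 series * 3 periods = 6 rows, matching the shape rule above.
```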
c
Hi @José Morales, quick question: do Statsforecast and MLforecast handle non-stationary data? If yes, how?
j
Hey. In statsforecast it depends on the model, the ARIMA model for example will take differences and seasonal differences to try to make it stationary. In mlforecast you can use target transformations
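Differencing is the core idea behind both approaches; a pure-numpy sketch of what a differencing target transformation does (the values and predicted differences are made up, and mlforecast's own `Differences` transform handles this internally):

```python
import numpy as np

# A trending (non-stationary) series.
y = np.array([10.0, 12.0, 15.0, 19.0, 24.0])

# First difference removes the trend; the model is fit on d instead of y.
d = np.diff(y)  # [2., 3., 4., 5.]

# Suppose the model predicts the next two differences.
pred_d = np.array([6.0, 7.0])

# Invert the transform: cumulatively sum the predicted differences
# onto the last observed value to get forecasts on the original scale.
forecast = y[-1] + np.cumsum(pred_d)  # [30., 37.]
```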
c
Great, thanks
@José Morales thanks for all these points you've made clear. The package is pure magic, it really speeds up my daily forecasting tasks. For the target transformations you just clarified, I assume there is a back-transformation once forecasts are made. How is it done? Is there any guarantee that it's unbiased?
j
Hey. To restore the original values we only invert the transformation, e.g. for the standard scaler we multiply by the standard deviation and add back the mean
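That inversion in numpy (toy values for illustration):

```python
import numpy as np

y = np.array([2.0, 4.0, 6.0, 8.0])
mu, sigma = y.mean(), y.std()

# Standard-scale before fitting...
z = (y - mu) / sigma

# ...and invert predictions afterwards: multiply by the
# standard deviation and add back the mean.
restored = z * sigma + mu
```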
c
Ok, thanks. I'm asking because I know the naive inverse of some transformations (like power transforms, e.g. Box-Cox; not sure about differencing) induces a bias in the "original" space that needs to be corrected. https://robjhyndman.com/hyndsight/backtransforming/ https://github.com/scikit-learn/scikit-learn/issues/15881
j
They don't adjust for bias at the moment
although I think those problems only happen with power transformations like Box-Cox, the other scalers and differences are linear so they should be fine
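A quick numpy illustration of why the bias shows up for log-like transforms but not linear ones (simulated data, not a real forecasting run): linear transforms commute with the mean, but for a nonlinear one like log, back-transforming the mean prediction underestimates the mean on the original scale, since for normal X, E[exp(X)] = exp(mu + sigma^2/2) > exp(E[X]).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=100_000)

# Naive back-transform of the mean in log space: exp(E[X]) ~ exp(0) = 1.0
naive = np.exp(x.mean())

# True mean on the original scale: E[exp(X)] ~ exp(0.5) ~ 1.65
actual = np.exp(x).mean()
```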