Matej
10/05/2023, 9:39 PM

fcst = MLForecast(
models = models,
freq = '15T',
target_transforms = [LocalStandardScaler()],
# lags = np.arange(l , l + n).tolist() + np.arange(l + 94 , l + n + 94).tolist(),
# lag_transforms is a dictionary where the key is the lag and the value is a list of transformations to apply to that lag.
# Here, 1: [expanding_mean] means apply the expanding_mean transformation to the lag 1.
# And 24: [(rolling_mean, 48)] means apply the rolling_mean transformation with a window of 48 to the lag 24.
    # note: a dict literal can't repeat the key `l`, so the two
    # transforms are combined into a single list for that lag
    lag_transforms = {
        l: [expanding_mean, (rolling_mean, 192)],
    },
num_threads = -1,
)
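To make the comments above concrete, here is a small NumPy sketch of the two transforms (reimplemented for illustration; in the snippet they would typically come from `window_ops` or MLForecast's built-in lag transforms):

```python
import numpy as np

def expanding_mean(x):
    # mean of all values seen up to each position
    return np.cumsum(x) / np.arange(1, len(x) + 1)

def rolling_mean(x, window):
    # mean over a trailing window; NaN until the window is full
    out = np.full(len(x), np.nan)
    c = np.cumsum(np.insert(x, 0, 0.0))
    out[window - 1:] = (c[window:] - c[:-window]) / window
    return out

y = np.array([1.0, 2.0, 3.0, 4.0])
print(expanding_mean(y))   # [1.  1.5 2.  2.5]
print(rolling_mean(y, 2))  # [nan 1.5 2.5 3.5]
```

MLForecast applies each transform to the target shifted by the given lag, so `l: [expanding_mean, (rolling_mean, 192)]` produces two features per series from the lag-`l` values.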
Is there an example or tutorial in the docs of combining this with, for example, PCA on the exogenous X inputs?
Thanks :)

Matej
10/05/2023, 9:54 PM

GlobalSklearnTransformer(BaseTargetTransform):
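The `GlobalSklearnTransformer` line above is only a fragment. A self-contained sketch of what such a target transform might look like follows; the base class here is a minimal stand-in for MLForecast's `BaseTargetTransform`, and the method and attribute names are assumptions rather than the library's exact API:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

class BaseTargetTransform:
    """Stand-in for mlforecast.target_transforms.BaseTargetTransform (assumption)."""
    id_col = 'unique_id'
    time_col = 'ds'
    target_col = 'y'

class GlobalSklearnTransformer(BaseTargetTransform):
    """Fit ONE sklearn transformer on the target across all series,
    unlike LocalStandardScaler, which scales each series separately."""
    def __init__(self, transformer):
        self.transformer = transformer

    def fit_transform(self, df):
        df = df.copy()
        df[self.target_col] = self.transformer.fit_transform(
            df[[self.target_col]].to_numpy()).ravel()
        return df

    def inverse_transform(self, df):
        df = df.copy()
        df[self.target_col] = self.transformer.inverse_transform(
            df[[self.target_col]].to_numpy()).ravel()
        return df

df = pd.DataFrame({'unique_id': ['a'] * 3 + ['b'] * 3,
                   'ds': list(range(3)) * 2,
                   'y': [1.0, 2.0, 3.0, 10.0, 20.0, 30.0]})
sc = GlobalSklearnTransformer(StandardScaler())
scaled = sc.fit_transform(df)
restored = sc.inverse_transform(scaled)
```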
José Morales
10/05/2023, 10:24 PM

Matej
10/06/2023, 6:59 AM

José Morales
10/06/2023, 7:39 PM

from sklearn.compose import ColumnTransformer
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
pca = ColumnTransformer([('pca', PCA(), ['exog1', 'exog2'])], remainder='passthrough')
model = make_pipeline(pca, your_model)
fcst = MLForecast(models=model, ...)
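The snippet above can be fleshed out into a runnable sketch with synthetic data (the column names, `LinearRegression`, and the data itself are placeholders; inside MLForecast the pipeline would receive the feature frame built from your lags and exogenous columns):

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(100, 4)),
                 columns=['exog1', 'exog2', 'lag1', 'lag2'])
y = X['exog1'] + 0.5 * X['lag1'] + rng.normal(scale=0.1, size=100)

# compress exog1/exog2 into a single principal component and
# pass the remaining (lag) columns through unchanged
pca = ColumnTransformer([('pca', PCA(n_components=1), ['exog1', 'exog2'])],
                        remainder='passthrough')
model = make_pipeline(pca, LinearRegression())
model.fit(X, y)
preds = model.predict(X)
```

Note that because `ColumnTransformer` here selects columns by name, the pipeline must be fed a DataFrame containing those columns.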