# mlforecast
j
Is it possible that the target transform using GlobalSklearnTransformer inside mlforecast is buggy? I realised that when I use the min-max scaler during training and make inference, the target is not re-scaled again. But I also don't find a way to access the fitted scaler for the reverse transformation:
```python
from sklearn.preprocessing import MinMaxScaler, PowerTransformer

from mlforecast.target_transforms import GlobalSklearnTransformer

sk_boxcox = PowerTransformer(method="box-cox", standardize=False)
boxcox_global = GlobalSklearnTransformer(sk_boxcox)

scaler = MinMaxScaler(feature_range=(0, 1))
minmax_global = GlobalSklearnTransformer(scaler)

target_transforms = [minmax_global]

# later passed to the forecaster as
# MLForecast(..., target_transforms=target_transforms)
```
Or maybe this is intended and I was never aware. I also feel the documentation is not super clear. Or is it a bug on my side, in how I use it?
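For context, a minimal end-to-end sketch of how these transforms are wired into MLForecast; the model, frequency, and data here are illustrative assumptions, not from the thread:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

from mlforecast import MLForecast

# tiny synthetic panel (unique_id, ds, y) just to make the sketch runnable
train_df = pd.DataFrame({
    "unique_id": "A",
    "ds": pd.date_range("2024-01-01", periods=60, freq="D"),
    "y": np.linspace(10.0, 100.0, 60),
})

mlf = MLForecast(
    models=[LinearRegression()],  # illustrative model choice
    freq="D",
    lags=[1, 7],
    target_transforms=target_transforms,  # the list built above
)
mlf.fit(train_df)
preds = mlf.predict(h=7)  # should already be back on the original scale
```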
j
Can you provide an example? I don't understand if you're using both Box-Cox and min-max, and what you mean by "the target is not re-scaled again". Using a global transformer computes the statistics across all series and uses the same scale for all of them, so maybe that scaling isn't good for the model and it just predicts zero or similar
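To illustrate that point, a minimal sketch with plain sklearn (outside mlforecast) of how one global scale can squash a small-magnitude series:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

y_a = np.array([1.0, 2.0, 3.0])        # small-scale series
y_b = np.array([100.0, 200.0, 300.0])  # large-scale series

# global: one scaler fit on all series stacked together,
# so y_a ends up squashed near zero
global_scaler = MinMaxScaler().fit(np.concatenate([y_a, y_b]).reshape(-1, 1))
print(global_scaler.transform(y_a.reshape(-1, 1)).ravel())  # ~[0.000, 0.003, 0.007]

# local: one scaler per series, each spans the full [0, 1] range
print(MinMaxScaler().fit_transform(y_a.reshape(-1, 1)).ravel())  # [0.0, 0.5, 1.0]
```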
j
No, those were just some examples. During my tuning I test both, but I don't use the local components from the target transforms; I use the sklearn-based solution. But I just saw there is LocalBoxCox in the target transforms. It should transform the target back during prediction, right?
But the global one should also work, and the evaluation values during tuning look fine. I wonder if it might also be how the model is saved in MLflow as dill. But I don't get it; it shouldn't matter whether it's dill, pickle or joblib. But I also had this bug with the combine method in lag transforms. I will do a couple of tests and let you know tomorrow or so.
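For reference, a minimal sketch of the LocalBoxCox setup mentioned above; the model and frequency are illustrative placeholders:

```python
from sklearn.linear_model import LinearRegression

from mlforecast import MLForecast
from mlforecast.target_transforms import LocalBoxCox

# Box-Cox requires strictly positive targets; the inverse transform
# is applied per series automatically when calling predict()
mlf_local = MLForecast(
    models=[LinearRegression()],
    freq="D",
    lags=[1, 7],
    target_transforms=[LocalBoxCox()],
)
```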
j
you can run a quick check by loading your serialized model and checking if the target transforms are there, e.g.
```python
mlf = your_load_fn(path)
assert mlf.ts.target_transforms is not None
```
if that assert fails then the target transformations weren't saved and thus the results will be the raw predictions from the model
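For completeness, one possible shape of `your_load_fn`, assuming the artifact was dumped with dill (the function name and path are placeholders):

```python
import dill

def your_load_fn(path):
    # deserialize however the model was saved; dill.load here is an assumption
    with open(path, "rb") as f:
        return dill.load(f)
```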
j
Ah, perfect! Amazing.
So the min-max scaler is found inside the loaded model.
j
The inverse transform should be applied to the predictions then; you can check its stats with
```python
# the fitted sklearn scaler stored by GlobalSklearnTransformer
fitted_tfm = loaded_model.ts.target_transforms[0].transformer_
fitted_tfm.data_min_, fitted_tfm.data_max_
```
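As a follow-up sanity check, the fitted scaler can also be applied manually; a sketch with illustrative values, not from the thread:

```python
import numpy as np

# illustrative values in the scaled [0, 1] space
scaled = np.array([[0.0], [0.5], [1.0]])
print(fitted_tfm.inverse_transform(scaled))  # should span data_min_ .. data_max_
```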
j
So using your LocalBoxCox solution works well, but the min-max global sklearn-based scaler showed predictions much lower than what was seen during training. Not sure what happened; I had some issues with scaling before but never had the time to investigate further. I will use the local solutions now and unfortunately can't investigate further what happens when using the global solution. It might very well be something on my side, although I applied it as shown in the documentation, but who knows. Thanks so much again Jose for your help, you really saved my night again with your tips 🙂 thanks so much