# mlforecast
k
Is it normal that when I try to move a DistributedMLForecast to local by calling the method .to_local() I get an error?
```
mlforecast/distributed/forecast.py", line 795, in combine_target_tfms
    [part[i] for part in by_partition] for i in range(len(by_partition[0]))
TypeError: object of type 'NoneType' has no len()
```
j
Which target transformations are you using?
k
RollingMean and SeasonalRollingMean. I got the same error with a more complex setup using Combine and operators.
j
Those are lag transforms; target transforms are things like differences, scalers, etc.
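To illustrate the distinction, here's a minimal pure-Python sketch (illustrative only, not mlforecast's actual implementation): a lag transform computes a *feature* from past target values, while a target transform modifies the target itself and must be invertible so predictions can be mapped back.

```python
def rolling_mean_feature(y, lag, window_size):
    # Lag-transform style: a feature built from past values of the series.
    # At position t it averages the `window_size` values ending `lag` steps back.
    feats = []
    for t in range(len(y)):
        start = t - lag - window_size + 1
        end = t - lag + 1
        if start < 0:
            feats.append(None)  # not enough history yet
        else:
            feats.append(sum(y[start:end]) / window_size)
    return feats


class Differences:
    # Target-transform style: changes the target itself and is
    # inverted on the predictions afterwards.
    def fit_transform(self, y):
        self.first = y[0]
        return [b - a for a, b in zip(y, y[1:])]

    def inverse_transform(self, diffs):
        out = [self.first]
        for d in diffs:
            out.append(out[-1] + d)
        return out
```

The lag transform only adds columns for the model; the target transform rewrites the series the model is trained on, which is why the two are configured separately.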
k
Lag transforms then.
No scalers or target diffs.
lag_transforms and lags=[1, 7]
j
I see. The code assumes that if there aren't any target transforms they're an empty list, but they're actually None, so that's what's failing. In the meantime, can you try adding a dummy global transformation? e.g.
```python
from mlforecast.target_transforms import GlobalSklearnTransformer
from sklearn.preprocessing import FunctionTransformer

dummy_tfm = GlobalSklearnTransformer(FunctionTransformer())
fcst = DistributedMLForecast(..., target_transforms=[dummy_tfm])
```
That'll just apply the identity and should work while we fix the issue.
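For context, here's a minimal sketch of the failure mode described above. The function name mirrors the one in the traceback, but the body is a hypothetical simplification, not mlforecast's actual internals: the combine step assumes each partition contributes a list of fitted target transforms, so when `target_transforms` is None every partition reports None and `len(by_partition[0])` raises.

```python
def combine_target_tfms(by_partition):
    # Hypothetical simplification of the combine step: transpose the
    # per-partition lists so each transform sees all of its partition copies.
    # Assumes every element of by_partition is a list.
    return [
        [part[i] for part in by_partition]
        for i in range(len(by_partition[0]))
    ]


def combine_target_tfms_fixed(by_partition):
    # Guard: treat None (no target transforms configured) as "nothing to combine".
    if by_partition[0] is None:
        return None
    return [
        [part[i] for part in by_partition]
        for i in range(len(by_partition[0]))
    ]


# With target transforms present, combining works:
combine_target_tfms([["scaler_p0"], ["scaler_p1"]])

# Without target transforms each partition holds None, which fails:
try:
    combine_target_tfms([None, None])
except TypeError as e:
    print(e)  # object of type 'NoneType' has no len()
```

The dummy identity transform works around this by making the per-partition value a non-empty list, so the transpose above never sees None.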
k
Thanks, I will give it a try