# general
Hi @Korek Testowy! Thanks for reaching out. Which library are you using?
@Max (Nixtla), statsforecast.distributed.forecast
@Max (Nixtla), do you have any advice on this? 🙂
@José Morales, could you help, please? As far as I know, there is no statsforecast.distributed.fit or statsforecast.distributed.predict.
I'm so sorry about not answering before @Korek Testowy. The `forecast` method is fully compatible with distributed clusters precisely because it does not store any model parameters (it would be expensive and slow to store local parameters across clusters and then send them between machines). If you want to store the parameters of every model, you can use the `fit` and `predict` methods. However, those methods are not defined for distributed engines like Spark, Ray, or Dask. So, yes, indeed, statsforecast does not support distributed `.fit` and `.predict`. The intuition is that with statistical methods (local models) it usually makes more sense to retrain right before forecasting. If you want to use pretrained global models for forecasting and save on retraining, we would probably recommend mlforecast.
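To make the stateless-vs-stateful distinction concrete, here is a toy sketch in plain Python (this is *not* the statsforecast API; `forecast`, `MeanModel`, `fit`, and `predict` below are hypothetical stand-ins). The stateless `forecast` refits a trivial per-series model on every call, so there are no parameters to ship between machines; the stateful `fit`/`predict` pair stores per-series parameters that would have to be collected and broadcast in a distributed setting:

```python
def forecast(series: dict[str, list[float]], h: int) -> dict[str, list[float]]:
    """Stateless: refit a naive mean model per series on every call and
    return h-step forecasts. Nothing survives between calls, so the work
    can be sharded across machines with no parameters to move around."""
    return {uid: [sum(y) / len(y)] * h for uid, y in series.items()}


class MeanModel:
    """Stateful alternative: fit() stores one parameter per series."""

    def fit(self, series: dict[str, list[float]]) -> "MeanModel":
        # Store the fitted parameter (here, just the mean) per series.
        self.means_ = {uid: sum(y) / len(y) for uid, y in series.items()}
        return self

    def predict(self, h: int) -> dict[str, list[float]]:
        # Requires the stored means_; in a distributed setting this state
        # would need to be gathered and redistributed across workers.
        return {uid: [m] * h for uid, m in self.means_.items()}


series = {"A": [1.0, 2.0, 3.0], "B": [10.0, 10.0]}
print(forecast(series, h=2))                 # stateless path
print(MeanModel().fit(series).predict(h=2))  # stateful path
```

Both paths return the same numbers here; the difference is only *where* the parameters live, which is what matters on a cluster.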
In particular, I would suggest you check the `.update` method. Here is a tutorial: https://nixtla.github.io/mlforecast/docs/end_to_end_walkthrough.html#updating-series-values
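The idea behind updating series values can be sketched in plain Python (a hypothetical illustration, *not* mlforecast's implementation; `GlobalLagModel` and its methods are invented for this example). A model trained once keeps the tail of each series; `update` refreshes the stored values as new observations arrive, and `predict` then forecasts without any retraining:

```python
class GlobalLagModel:
    """Toy pretrained model: predicts the last observed value (lag-1)."""

    def fit(self, series: dict[str, list[float]]) -> "GlobalLagModel":
        # "Training" here is trivial; a real global model would fit shared
        # parameters across all series at this point, once.
        self.tails_ = {uid: y[-1] for uid, y in series.items()}
        return self

    def update(self, new_values: dict[str, float]) -> None:
        # Append new observations without touching the fitted model.
        self.tails_.update(new_values)

    def predict(self, h: int) -> dict[str, list[float]]:
        return {uid: [tail] * h for uid, tail in self.tails_.items()}


model = GlobalLagModel().fit({"A": [1.0, 2.0], "B": [5.0]})
model.update({"A": 3.0})   # a new observation arrives for series A only
print(model.predict(h=2))  # forecasts reflect the update, with no refit
```

This is the saving the `.update` workflow buys you: new data flows in, forecasts stay current, and the (potentially expensive) global training step is not repeated.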