Slackbot
12/20/2023, 8:28 AM

José Morales
12/20/2023, 4:26 PM
• input_size lets you define the size of the training set to build a sliding window, which advances step_size periods on each fold. If you don't specify input_size, an expanding window is used (see the sketch after this list).
• We don't have any functions to ensemble models, but you can follow this tutorial and use similar logic to choose the weights for an ensemble (the sketch below includes a simple weighted average).
• mlforecast provides an update method that updates the stored series, so that if you call predict again it uses the new values; however, the models aren't retrained. If you want to update the model itself you could, for example, use LightGBM and perform incremental learning, similar to what we do in the custom training guide.
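A minimal sketch illustrating the points above (my own example, not part of the thread; the data, models, and ensemble weights are all made up):

```python
import lightgbm as lgb
import numpy as np
import pandas as pd
from mlforecast import MLForecast
from sklearn.linear_model import LinearRegression

# Two synthetic monthly series in mlforecast's long format.
rng = np.random.default_rng(0)
frames = []
for uid, n in [("A", 48), ("B", 72)]:
    ds = pd.date_range("2018-01-01", periods=n, freq="MS")
    y = 10 + 0.5 * np.arange(n) + rng.normal(scale=1.0, size=n)
    frames.append(pd.DataFrame({"unique_id": uid, "ds": ds, "y": y}))
df = pd.concat(frames, ignore_index=True)

fcst = MLForecast(
    models=[lgb.LGBMRegressor(), LinearRegression()],
    freq="MS",
    lags=[1, 12],
)

cv_df = fcst.cross_validation(
    df,
    n_windows=3,    # number of folds
    h=6,            # forecast horizon of each fold
    step_size=6,    # each window advances 6 periods
    input_size=24,  # train on the last 24 periods only; omit for an expanding window
)

# Simple weighted ensemble of the two models' predictions
# (the weights here are arbitrary; the tutorial shows how to choose them).
weights = {"LGBMRegressor": 0.7, "LinearRegression": 0.3}
cv_df["ensemble"] = sum(w * cv_df[m] for m, w in weights.items())
print(cv_df.head())
```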
Galvan Goh
12/21/2023, 7:54 AM
> input_size lets you define the size of the training set to build a sliding window, which advances step_size periods on each fold. If you don't specify input_size, an expanding window is used.
• In a scenario where I have a dataset containing the sales series of 2 products of different lengths (e.g. product A has 12 rows of data, product B has 72 rows of data), I want to control both input_size and step_size in the cross-validation process of each series (e.g. 3 folds for product A and 10 folds for product B) and, at the same time, use a single model to train both series. Is that possible?
• From the cross-validation guide here, it looks like a single, identical sliding-window configuration is applied to every series in df.
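A hedged sketch of one possible workaround (my own, not confirmed in the thread; it reuses df and fcst from the sketch above): since cross_validation applies one window configuration to every series in df, per-series fold counts require separate calls. The caveat is that each call fits its own model, so this evaluates the series independently rather than with the single shared model the question asks about.

```python
import pandas as pd

# Per-series fold settings (illustrative values only).
per_series_cfg = {
    "A": dict(n_windows=3, h=1, step_size=1, input_size=6),
    "B": dict(n_windows=10, h=1, step_size=1, input_size=24),
}
folds = []
for uid, cfg in per_series_cfg.items():
    sub = df[df["unique_id"] == uid]
    folds.append(fcst.cross_validation(sub, **cfg))  # fits a fresh model per call
cv_per_series = pd.concat(folds, ignore_index=True)
```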
> mlforecast provides an update method that updates the stored series, so that if you call predict again it uses the new values; however, the models aren't retrained. If you want to update the model itself you could, for example, use LightGBM and perform incremental learning, similar to what we do in the custom training guide.
• Incremental learning is what I'm looking for. If I understand the guide correctly, by providing the sample_weights of a LightGBM model trained on the previous month (lgbmr_old) when training a new LightGBM model this month (lgbmr_new), the fitting process will be much faster because lgbmr_new only trains on the new data points. Since sample_weights are provided, lgbmr_new will not be training on the old data.
• How can I get the weights from a trained model?
José Morales
12/22/2023, 4:10 PM
• You can use the init_model argument of fit in LightGBM, for example; that way you can train the model for some more iterations with your new data (see the sketch after this list).
• The trained models are stored in the MLForecast.models_ attribute, so you can use their interfaces to get whatever you need.
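A hedged sketch of both points together (my own example, assuming the fcst and df objects from the first sketch above; new_df is a placeholder name for a dataframe holding the latest observations):

```python
import lightgbm as lgb

fcst.fit(df)                               # fit once; trained models are stored by name
old_model = fcst.models_["LGBMRegressor"]  # a fitted lgb.LGBMRegressor

# Use the model's own interface, e.g. to inspect the trained booster.
print(old_model.booster_.feature_importance(importance_type="gain"))

# Incremental learning: build the feature matrix for the new data and
# continue boosting from the previous model via init_model.
prep = fcst.preprocess(new_df)  # new_df: same long format, with the new observations
X_new = prep.drop(columns=["unique_id", "ds", "y"])
y_new = prep["y"]
new_model = lgb.LGBMRegressor(n_estimators=50).fit(X_new, y_new, init_model=old_model)
```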