Marco
02/28/2025, 4:05 PM
quantiles in predict, allowing for easy quantile retrieval for all DistributionLosses.
- Mixture losses (GMM, PMM and NBMM) now support learned weights for weighted mixture distribution outputs.
- Mixture losses now support the use of quantiles in predict, allowing for easy quantile retrieval.
- Improved stability of ISQF by adding softplus protection around some parameters instead of using .abs.
- Unified API for any quantile or any confidence level during predict for both point and distribution losses.
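The softplus protection mentioned for ISQF can be sketched in plain Python. This is an illustrative toy, not neuralforecast's actual code: the point is that softplus keeps a scale-type parameter strictly positive and smooth everywhere, whereas .abs collapses to exactly zero at the origin and has a kinked gradient there.

```python
import math

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)),
    # rewritten to avoid overflow for large |x|.
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

# softplus stays strictly positive even at x = 0 ...
assert softplus(0.0) > 0.0
# ... while abs(0.0) is exactly 0, which can make a
# distribution's scale parameter degenerate during training.
assert abs(0.0) == 0.0
```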
Enhancements
- Improve docstrings of all models
- Minor bug fix in TFT: we can omit specifying an RNN type and the static covariate encoder will still work.
- Fitting with an invalid validation size now prints a nice error message
- Add bfloat16 support
- Recurrent models can now produce forecasts recursively or directly.
- IQLoss now gives monotonic quantiles
- MASE loss now works
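A common way to guarantee monotonic quantiles like the IQLoss change above is to predict the lowest quantile directly and add strictly positive increments for the rest. The construction below is a generic sketch of that idea, not neuralforecast's actual implementation:

```python
import math

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def monotonic_quantiles(raw_outputs):
    """Map unconstrained network outputs to non-decreasing quantiles.

    The first output is taken as-is; each subsequent quantile adds a
    strictly positive softplus increment, so q1 <= q2 <= ... holds by
    construction, no matter what the network emits.
    """
    qs = [raw_outputs[0]]
    for delta in raw_outputs[1:]:
        qs.append(qs[-1] + softplus(delta))
    return qs

# Even wildly unsorted raw outputs yield a sorted quantile vector.
qs = monotonic_quantiles([0.3, -1.2, 0.5, -4.0])
assert all(a <= b for a, b in zip(qs, qs[1:]))
```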
Breaking Changes
- Unify API
- RMoK uses the revin_affine parameter instead of revine_affine. This was a typo in the previous version.
- All models now inherit the BaseModel class. This changes how we implement new models in neuralforecast.
- Recurrent models now require an input_size parameter.
- TCN and DRNN are now window models, not recurrent models
- Recurrent models saved with a previous version cannot be loaded in v3.0.0
Bug Fixes
- Multivariate models no longer error when predicting with n_series > batch_size
- Insample prediction works with series of varying lengths
Documentation
- Big overhaul of the documentation to remove old and deprecated code.
- Add example of modifying the default configure_optimizers() behavior (use of the ReduceLROnPlateau scheduler)
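The new documentation example covers swapping in a ReduceLROnPlateau scheduler via configure_optimizers(). The core plateau logic that scheduler relies on can be sketched in pure Python; this is a toy re-implementation of the idea, not PyTorch's actual scheduler:

```python
def reduce_on_plateau(val_losses, lr=1e-3, factor=0.5, patience=2, min_lr=1e-6):
    """Toy sketch of the ReduceLROnPlateau idea: multiply the learning
    rate by `factor` after more than `patience` epochs without any
    improvement in the validation loss. Returns the lr used each epoch."""
    best = float("inf")
    bad_epochs = 0
    history = []
    for loss in val_losses:
        if loss < best:
            best = loss          # improvement: reset the patience counter
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs > patience:
                lr = max(lr * factor, min_lr)  # plateau: shrink the lr
                bad_epochs = 0
        history.append(lr)
    return history

# lr stays put while the loss improves, then halves once it plateaus.
lrs = reduce_on_plateau([1.0, 0.9, 0.9, 0.9, 0.9], lr=1e-3, patience=2)
```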
This release solves many of your pain points and adds features that have been asked for over a long time.
Big thanks to @Olivier for his amazing contribution to this release, as well as to all our users for taking the time to raise issues and ask questions. We'll keep working on improving neuralforecast!