Hello team! We just made a new release of neuralforecast with some pretty important changes:

**New features**
- New model: TimeXer, a Transformer-based model specifically designed to handle exogenous features.
- All losses are now compatible with all types of models (e.g. univariate/multivariate, direct/recurrent), or appropriate protection has been added where they are not.
- `DistributionLoss` now supports the use of `quantiles` in `predict`, allowing easy quantile retrieval for all `DistributionLosses`.
- Mixture losses (GMM, PMM and NBMM) now support learned weights for weighted mixture distribution outputs.
- Mixture losses now support the use of `quantiles` in `predict`, allowing easy quantile retrieval.
- Improved stability of `ISQF` by adding softplus protection around some parameters instead of using `.abs`.
- Unified API for any quantile or any confidence level during `predict`, for both point and distribution losses (see the sketch after this list).
**Enhancements**
- Improved docstrings of all models.
- Minor bug fix in TFT: we can omit specifying an RNN type and the static covariate encoder will still work.
- Fitting with an invalid validation size now prints a nice error message.
- Added bfloat16 support.
- Recurrent models can now produce forecasts recursively or directly.
- IQLoss now gives monotonic quantiles.
- MASE loss now works.

**Breaking Changes**
- Unified API.
- RMoK uses the `revin_affine` parameter instead of `revine_affine`. This was a typo in the previous version.
- All models now inherit the `BaseModel` class. This changes how we implement new models in neuralforecast.
- Recurrent models now require an `input_size` parameter (see the sketch after this list).
- `TCN` and `DRNN` are now window models, not recurrent models.
- We cannot load a recurrent model from a previous version into v3.0.0.
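For the recurrent-model change, upgrading looks roughly like this (the model and values are placeholders, not from the release notes):

```python
from neuralforecast.models import LSTM

# Before v3.0.0, recurrent models could infer their input window.
# They now take an explicit input_size, like window-based models.
model = LSTM(
    h=12,           # forecast horizon
    input_size=24,  # now required: length of the input window
)
```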
**Bug Fixes**
- Multivariate models no longer error when predicting when `n_series` > `batch_size`.
- Insample prediction works with series of varying lengths.

**Documentation**
- Big overhaul of the documentation to remove old and deprecated code.
- Added an example of modifying the default `configure_optimizers()` behavior (use of the `ReduceLROnPlateau` scheduler); a sketch follows below.
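The new docs example is along these lines: since neuralforecast models are PyTorch Lightning modules, you can override `configure_optimizers()` in a subclass. The base model, learning rate, and monitored metric name below are assumptions for illustration:

```python
import torch
from neuralforecast.models import NHITS

class NHITSWithPlateau(NHITS):
    # Override the Lightning hook to swap in a ReduceLROnPlateau
    # scheduler in place of the default optimizer configuration.
    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
            optimizer, mode="min", factor=0.5, patience=2
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                "monitor": "train_loss",  # assumed logged metric name
            },
        }
```

The subclass can then be used anywhere the original model would be, e.g. `NHITSWithPlateau(h=12, input_size=24)`.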
This release solves many pain points for our users and adds features that have been requested for a long time. Most of these features come from @Olivier Sprangers' massive PR, so many thanks for that and for all your work on the documentation.