Zac Pullar-Strecker
02/24/2025, 8:37 PM
insample_y), but I assume there's a decent amount in the nf/dataset/loss side that would need to be modified to support multiple targets as well.
I guess I'd love to hear how difficult you think this modification would be and any tips for diving into it.

Marco
02/28/2025, 4:05 PM
New features
- Support for quantiles in predict, allowing easy quantile retrieval for all DistributionLosses.
- Mixture losses (GMM, PMM and NBMM) now support learned weights for weighted mixture distribution outputs.
- Mixture losses now support the use of quantiles in predict, allowing for easy quantile retrieval.
- Improved stability of ISQF by adding softplus protection around some parameters instead of using .abs.
- Unified API for any quantile or any confidence level during predict for both point and distribution losses (a short sketch follows this list).
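A minimal usage sketch, added here for illustration only; it assumes nf is an already fitted NeuralForecast object and the predict arguments described in the notes above:

# Illustration: request arbitrary quantiles or confidence levels at predict time,
# assuming nf is a fitted NeuralForecast instance (per the notes above).
quantile_preds = nf.predict(quantiles=[0.1, 0.5, 0.9])
level_preds = nf.predict(level=[80, 95])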
Enhancements
- Improved docstrings of all models.
- Minor bug fix in TFT: we can omit specifying an RNN type and the static covariate encoder will still work.
- Fitting with an invalid validation size now prints a nice error message.
- Added bfloat16 support.
- Recurrent models can now produce forecasts recursively or directly.
- IQLoss now gives monotonic quantiles
- MASE loss now works
Breaking Changes
- Unified API.
- RMoK uses the revin_affine parameter instead of revine_affine. This was a typo in the previous version.
- All models now inherit the BaseModel class. This changes how we implement new models in neuralforecast.
- Recurrent models now require an input_size parameter (a short sketch follows this list).
- TCN and DRNN are now window models, not recurrent models.
- We cannot load a recurrent model from a previous version into v3.0.0.
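To illustrate the input_size change (a sketch added in editing, with placeholder values, not part of the original notes):

from neuralforecast import NeuralForecast
from neuralforecast.models import LSTM

# In v3.0.0 recurrent models must be given input_size explicitly.
model = LSTM(h=24, input_size=48, max_steps=100)
nf = NeuralForecast(models=[model], freq='H')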
Bug Fixes
- Multivariate models no longer error when predicting with n_series > batch_size.
- Insample prediction works with series of varying lengths
Documentation
- Big overhaul of the documentation to remove old and deprecated code.
- Added an example of modifying the default configure_optimizers() behavior (use of the ReduceLROnPlateau scheduler); a rough sketch of the idea follows below.
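A rough sketch of the idea, added in editing; the exact recipe is the one in the documentation example mentioned above. NHITS is used only as an arbitrary example model, and the monitored metric name is an assumption:

import torch
from neuralforecast.models import NHITS

class NHITSWithPlateau(NHITS):
    # Override the Lightning hook so ReduceLROnPlateau drives the learning rate.
    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
            optimizer, mode='min', factor=0.5, patience=2
        )
        # 'train_loss' is assumed to be a logged metric; adjust to whatever your model logs.
        return {'optimizer': optimizer,
                'lr_scheduler': {'scheduler': scheduler, 'monitor': 'train_loss'}}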
This release solves many of your pain points and adds features that have been requested for a long time.
Big thanks to @Olivier for his amazing contribution to this release, as well as to all our users for taking the time to raise issues and ask questions. We'll keep working on improving neuralforecast!

Md Atiqur Rahaman
02/28/2025, 6:41 PM

田口天晴
03/07/2025, 2:49 PM

田口天晴
03/08/2025, 7:09 AM

Tony Gottschalg
03/14/2025, 10:20 AM

Bersu T
03/18/2025, 11:31 AM
modelLSTM = AutoLSTM(h=h,
                     loss=MAE(),
                     backend='optuna',
                     num_samples=10)
nf = NeuralForecast(models=[modelLSTM], freq='ME')
nf.fit(df=df_encoded, val_size=18)
Hi. When I do this, after running for a while I get an error stating Exception: Time series is too short for training, consider setting a smaller input size or set start_padding_enabled=True. Where are we expected to put the start_padding_enabled argument?
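A sketch added in editing, not a reply from the thread: the error message suggests start_padding_enabled is a model argument, so for an Auto model one would presumably place it inside the config handed to AutoLSTM rather than in the AutoLSTM(...) call itself. Names and values below are placeholders.

from neuralforecast.auto import AutoLSTM
from neuralforecast.losses.pytorch import MAE

h = 12  # placeholder horizon
base_config = AutoLSTM.get_default_config(h=h, backend='optuna')

def config_with_padding(trial):
    config = {**base_config(trial)}
    config['start_padding_enabled'] = True  # fixed, non-tuned model argument
    return config

modelLSTM = AutoLSTM(h=h, loss=MAE(), config=config_with_padding,
                     backend='optuna', num_samples=10)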
Bersu T
03/18/2025, 12:31 PM
lstm_config = AutoLSTM.get_default_config(h=h, backend="optuna")

def config_lstm(trial):
    config = {**lstm_config(trial)}
    config.update({
        "input_size": trial.suggest_int("input_size", 2, 18),
    })
    return config

modelLSTM = AutoLSTM(h=h,
                     config=config_lstm,
                     backend='optuna',
                     loss=MAE(),
                     num_samples=3)
During fitting, the following error is raised:
ValueError: Cannot set different distribution kind to the same parameter name.
[W 2025-03-18 12:28:42,864] Trial 0 failed with value None.
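For context, an illustration added in editing: Optuna raises this particular ValueError when the same parameter name is suggested with two different distribution kinds within one trial, which is likely what happens here because the default AutoLSTM config already suggests input_size in its own way before the custom suggest_int runs. A standalone reproduction of the Optuna behaviour:

import optuna

def objective(trial):
    trial.suggest_categorical('input_size', [2, 4, 8])  # first registration: categorical
    trial.suggest_int('input_size', 2, 18)              # same name, different kind -> ValueError
    return 0.0

study = optuna.create_study()
study.optimize(objective, n_trials=1)  # Trial 0 fails with the error shown above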
Sapna Mishra
03/20/2025, 10:18 PM

Ankit Hemant Lade
03/21/2025, 2:31 PM

Aditya Limaye
03/22/2025, 12:50 AM
df), I have values of the future_exogenous_cols for datetimes in the past, so the model has access to these values in the training pass. At inference time, I include the future_exogenous_cols in the "past" dataframe (df) when I call nf.predict() -- but is the model actually using these values?
thanks in advance!
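A sketch added in editing, not a reply from the thread: it shows the futr_df route the library exposes for supplying horizon values of future exogenous variables at predict time, with futr_exog_list declared on the model. Column names and data below are placeholders.

import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Tiny synthetic example; 'price_forecast' is a placeholder future-exogenous column.
ds = pd.date_range('2024-01-01', periods=200, freq='H')
train_df = pd.DataFrame({'unique_id': 'A', 'ds': ds,
                         'y': range(200), 'price_forecast': range(200)})

model = NHITS(h=24, input_size=48, futr_exog_list=['price_forecast'], max_steps=10)
nf = NeuralForecast(models=[model], freq='H')
nf.fit(df=train_df)  # history in long format: unique_id, ds, y, exogenous columns

# At predict time the horizon values of the future exogenous go in futr_df.
futr_ds = pd.date_range(ds[-1] + pd.Timedelta(hours=1), periods=24, freq='H')
future_df = pd.DataFrame({'unique_id': 'A', 'ds': futr_ds, 'price_forecast': range(24)})
preds = nf.predict(futr_df=future_df)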
Ankit Hemant Lade
03/24/2025, 11:07 PM

Sapna Mishra
03/25/2025, 11:23 PM

Bersu T
03/26/2025, 8:33 AM
num_samples and selecting just 4 out of the 176 unique IDs—the training still takes a very long time (about 30 minutes). This becomes even more problematic with the complete dataset. In contrast, when using MLForecast, training is significantly faster, taking only a few seconds. Could you please clarify why this happens and what I could do to mitigate this?

Jelte Bottema
03/26/2025, 1:05 PM

Sarah Unterseher
03/27/2025, 3:50 PM

Bersu T
03/31/2025, 10:15 AM

Jonghyun Yun
04/09/2025, 5:14 PM

Raj Puneeth
04/10/2025, 9:19 PM

Jan
04/11/2025, 11:34 PM
step_size when using the LSTM. Say I need to predict the next 24 hours every hour and I want to use the last 48 hours to do so, and I have future exogenous features that change every hour (for example weather forecasts), and turn into actuals when the time passes beyond the present.
My data frame right now consists of non-overlapping windows of 72 steps long, where the first 48 steps are mostly duplicates, as the actual values of the exogenous features change only one step at a time. So I'm basically using input_size=48, horizon=24 and step_size=72 when training an LSTM. However, I'm not sure that I'm doing this right, as it seems like the model trains very poorly even though there's a lot of data (for example, the forecasted values rarely start from the last known values), and the predictions on a future hold-out set are very poor.
Am I doing the windowing correctly? Or should I be feeding only 25-hour windows to the model (so input_size=1, horizon=24 and step_size=25) where the first row contains the latest actuals, and have the LSTM do the tracking of the past? And is this different for other architectures such as NHITS?
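A generic illustration added in editing, independent of neuralforecast internals: how input_size, horizon and step_size describe window slicing over one continuous series. With step_size=1 consecutive training windows overlap and shift by a single hour, rather than being pre-cut into non-overlapping 72-step blocks.

# Purely illustrative windowing arithmetic; values mirror the ones in the question.
input_size, horizon, step_size = 48, 24, 1
T = 500  # length of one continuous hourly series
windows = [(start, start + input_size, start + input_size + horizon)
           for start in range(0, T - input_size - horizon + 1, step_size)]
# Each tuple is (window start, forecast start, window end); with step_size=72
# the same list would contain only non-overlapping 72-step blocks.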
Bersu T
04/15/2025, 7:59 AM

Jelte Bottema
04/15/2025, 11:21 AM

Christiaan
04/22/2025, 7:48 AM

Bethany Earnest
04/23/2025, 4:09 PM

Jonathan Mackenzie
04/24/2025, 4:49 AM
nf.core.NeuralForecast.fit(): is there a reason we cannot set the size of the test set?

Renan Avila
04/24/2025, 10:37 PM

Jonathan Mackenzie
04/29/2025, 3:04 AM

Joaquin FERNANDEZ
05/06/2025, 3:46 PM

Rodrigo Sodré
05/07/2025, 12:32 PM# Y_df = Y_df.query("unique_id == 'H1'")[:700]
Then I got the attached images. Does anyone know what those crossed lines are?
Christiaan
05/08/2025, 7:40 AM