# neural-forecast
Hey all! I'm using neuralforecast for our finance time series data. I'm using a bunch of AutoModels for hyperparameter tuning.
```python
nf = NeuralForecast(
    models=self.models,
    freq=self.freq,
)
```
Here, `self.models` is a list of AutoModel instances. When I then run `nf.fit(df=train_df)`, all the models train in sequence. Since I'm using around 10 models, it's difficult to track which model is currently being trained. So, I was wondering whether there is a way to log/print the model that is being trained. I understand that I could run my own for loop for model training and prediction; I was just wondering whether there is a native way to do this. Thanks!
@Ask Inkeep Please help to answer the original question by @Pannaga S
Thanks for tagging me @Jing Qiang Goh. I'll look through relevant sources and get back to you shortly.
Based on the *NeuralForecast* library documentation, there isn't a built-in, native way to log or print which model is currently being trained when using the `NeuralForecast` class with multiple AutoModels (1) (2). The `NeuralForecast` class processes the models sequentially when you call the `fit` method, but the documentation doesn't show any parameters or methods to enable model-specific logging during training (2) (3). To track which AutoModel is currently training, you would need to implement your own logging solution by training the models individually in a loop rather than passing them all to the `NeuralForecast` class at once. AutoModels like `AutoNHITS`, `AutoLSTM`, etc. do support a `verbose` parameter that can be set to `True` to track progress, but this controls the hyperparameter-optimization progress within each individual model rather than indicating which model in your collection is being trained (4) (3).
```python
model = AutoNHITS(h=12, verbose=True)
```
(4) If you need to track which model is training, implementing your own training loop would be the most straightforward approach.
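The "own training loop" approach mentioned above can be sketched roughly as follows. This is a minimal, self-contained illustration of the pattern, not neuralforecast code: the `AutoNHITS`/`AutoLSTM` classes here are stand-ins, and in real code you would instead wrap each AutoModel in its own `NeuralForecast(models=[model], freq=...)` instance and call `nf.fit(df=train_df)` inside the loop.

```python
# Sketch: train models one at a time and log each model's class name first.
# The classes below are stand-ins for neuralforecast AutoModels so the
# pattern is runnable on its own; swap in the real library calls as noted.

class AutoNHITS:
    """Stand-in for neuralforecast.auto.AutoNHITS."""
    def fit(self, df):
        pass  # real AutoModels run hyperparameter tuning here

class AutoLSTM:
    """Stand-in for neuralforecast.auto.AutoLSTM."""
    def fit(self, df):
        pass

def fit_with_logging(models, train_df):
    """Fit each model separately, printing its position and class name."""
    trained_names = []
    for i, model in enumerate(models, start=1):
        name = type(model).__name__
        print(f"[{i}/{len(models)}] training {name}")
        # With neuralforecast, this line would instead be:
        #   nf = NeuralForecast(models=[model], freq=self.freq)
        #   nf.fit(df=train_df)
        model.fit(train_df)
        trained_names.append(name)
    return trained_names

names = fit_with_logging([AutoNHITS(), AutoLSTM()], train_df=None)
# names == ["AutoNHITS", "AutoLSTM"]
```

A side effect worth noting: training each AutoModel in its own `NeuralForecast` instance also lets you checkpoint or predict per model as each one finishes, instead of waiting for all ten to complete.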