# neural-forecast

Frank Lu

11/10/2023, 2:00 AM
I suggest adding a feature to neural-forecast that lets a trained model be saved as an ONNX model to improve inference speed. That would make the package more helpful in practice.
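For context, a rough sketch of what such an export could look like, assuming the trained model exposes its underlying `torch.nn.Module` and a representative input batch. The names `core_module` and `example_batch` are placeholders for illustration, not part of the neuralforecast API:

```python
import torch

def export_to_onnx(core_module: torch.nn.Module,
                   example_batch: torch.Tensor,
                   path: str = "model.onnx") -> None:
    """Trace the module with a representative batch and save it as ONNX.

    `core_module` and `example_batch` are assumed/hypothetical handles to the
    trained network and one correctly shaped input window.
    """
    core_module.eval()
    torch.onnx.export(
        core_module,
        example_batch,                 # example input used for tracing
        path,
        input_names=["window"],
        output_names=["forecast"],
        dynamic_axes={"window": {0: "batch"},    # allow variable batch size
                      "forecast": {0: "batch"}},
        opset_version=17,
    )
```

The exported file could then be run with ONNX Runtime, which is typically where the inference-speed gains would come from.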

Cristian (Nixtla)

11/11/2023, 5:47 PM
Hi @Frank Lu! Thanks for the suggestion! Have you tried the current version? Are inference times longer than you need for your application?

Frank Lu

11/12/2023, 6:16 AM
@Cristian (Nixtla) Yes, my current version of neural-forecast is 1.6.4. My task uses a history length of 60 and a forecast horizon of 20, with 17 features in hist_exog_list, 1 feature in future_exog_list, and a single output feature. Running inference on GPU, each sample forecast takes about 0.3 seconds, but my requirement is no more than 0.02 s.
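A minimal sketch of that setup for timing a single `predict()` call is below. The model choice (NHITS), the synthetic data, the hourly frequency, and the column names are assumptions for illustration; only the window sizes and exogenous feature counts come from the message above:

```python
import time
import numpy as np
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Synthetic long-format panel mirroring the described shapes:
# 17 historical exogenous features and 1 future exogenous feature (assumed names).
n = 500
dates = pd.date_range("2023-01-01", periods=n, freq="H")
hist_exog = [f"hist_feat_{i}" for i in range(17)]
futr_exog = ["futr_feat_0"]
df = pd.DataFrame({"unique_id": "series_1", "ds": dates, "y": np.random.rand(n)})
for col in hist_exog + futr_exog:
    df[col] = np.random.rand(n)

model = NHITS(
    h=20,                        # forecast horizon of 20 steps
    input_size=60,               # history window of 60 steps
    hist_exog_list=hist_exog,
    futr_exog_list=futr_exog,
    max_steps=50,                # small training budget, just for the sketch
)
nf = NeuralForecast(models=[model], freq="H")
nf.fit(df=df)

# Future exogenous values covering the 20-step horizon.
futr_df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": pd.date_range(dates[-1] + pd.Timedelta(hours=1), periods=20, freq="H"),
})
futr_df[futr_exog[0]] = np.random.rand(20)

start = time.perf_counter()
_ = nf.predict(futr_df=futr_df)  # one forecast call for the series
print(f"predict() took {time.perf_counter() - start:.3f}s")
```

Timing the bare `predict()` call this way would help separate model compute from data-handling overhead before deciding whether an ONNX export path is needed to reach the 0.02 s target.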