# general
k
Hi everyone! I was curious to see if anyone has tried using any of the Nixtla packages with the Lag-Llama model?
m
Hello! I am not sure what you mean by using Nixtla with Lag-Llama, but we have benchmarked it against other methods and you can see the fully reproducible results here: https://github.com/Nixtla/nixtla/tree/main/experiments/lag-llama
k
Awesome, thanks @Marco!
@Marco Looking through the experiment you shared, I notice that you don't use mlforecast/statsforecast/etc. to train or predict with Lag-Llama. Was there a particular reason you chose to do it that way?
m
It is to reproduce the same conditions as in the original Lag-Llama paper.
k
Ok, gotcha, that makes sense. Outside of reproducing those conditions, would it be possible to run this model with MLForecast, for example in a way similar to this custom training guide? https://nixtlaverse.nixtla.io/mlforecast/docs/how-to-guides/custom_training.html If you have any other examples or advice, please feel free to share!
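To make the question concrete, this is the kind of thing I had in mind, following the pattern from that guide (preprocess the data, fit or provide the model yourself, then register it in `models_`). It's only a rough, untested sketch: `LagLlamaRegressor` is a hypothetical placeholder, since the real Lag-Llama predictor works on raw series rather than on the tabular lag features MLForecast builds.

```python
import numpy as np
import pandas as pd
from mlforecast import MLForecast
from sklearn.base import BaseEstimator, RegressorMixin


class LagLlamaRegressor(BaseEstimator, RegressorMixin):
    """Hypothetical sklearn-style wrapper around a pretrained Lag-Llama predictor.

    A real wrapper would need to map MLForecast's tabular lag-feature rows back
    into series windows for the Lag-Llama estimator; this stub only shows where
    such a model would plug in.
    """

    def fit(self, X, y):
        # Pretrained / zero-shot model: nothing to train here.
        return self

    def predict(self, X):
        # Placeholder output; replace with calls into the Lag-Llama predictor.
        return np.zeros(len(X))


# Toy long-format data with the columns mlforecast expects.
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": pd.date_range("2024-01-01", periods=60, freq="D"),
    "y": np.random.rand(60),
})

fcst = MLForecast(models=[], freq="D", lags=[1, 7])

# Same flow as the custom training how-to: build features, fit the model
# yourself, then register it so that predict() can use it.
prep = fcst.preprocess(df)
X = prep.drop(columns=["unique_id", "ds", "y"])
y = prep["y"]
model = LagLlamaRegressor().fit(X, y)
fcst.models_ = {"lag_llama": model}
preds = fcst.predict(14)
```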
m
We have included some other foundation models in neuralforecast, but we don't plan to do so for Lag-Llama, at least to my knowledge. However, PRs are welcome, and you can always take some inspiration from how we integrated Time-LLM: https://github.com/Nixtla/neuralforecast/blob/main/nbs/models.timellm.ipynb
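For context, once a foundation model is integrated, using it from neuralforecast looks roughly like this. This is an untested sketch that leans on the defaults (Time-LLM pulls a small backbone LLM on first use); the notebook above has the exact constructor arguments.

```python
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import TimeLLM

# Toy long-format dataframe with the columns neuralforecast expects.
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
    "y": range(48),
})

# Only the required horizon/input size are set here; the prompt, patching and
# backbone options are documented in the Time-LLM notebook linked above.
model = TimeLLM(h=12, input_size=24, max_steps=10)

nf = NeuralForecast(models=[model], freq="MS")
nf.fit(df=df)
preds = nf.predict()
```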
k
Thanks so much!