# statsforecast
b
As for point 2: We’ve tried running the StatsForecast models on both M1 and Intel-based MacBook Pros and the result is always the same: the notebook kernel just freezes and is left running for several hours (likely compiling?), but nothing ever finishes. When I try it on one of our remote Linux servers, it runs quite smoothly. Any ideas?
k
You may be trying too many combinations? Are you using default parameters? To avoid division by zero, you can add 0.001 to the time series. And yes, you should fill missing values, since statsforecast drops missing rows.
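A minimal pandas sketch of the two suggestions above (filling the rows that would otherwise be dropped, and adding a small constant to avoid exact zeros). The column names `ds`/`y` follow statsforecast's long-format convention, but the preprocessing itself is plain pandas and the exact fill strategy is up to you:

```python
import pandas as pd

# Toy daily series with a missing day (2023-01-03 is absent) and a zero value.
df = pd.DataFrame({
    "ds": pd.to_datetime(["2023-01-01", "2023-01-02", "2023-01-04"]),
    "y": [10.0, 0.0, 12.0],
})

# Reindex to a complete daily range so no rows are silently dropped later.
full = df.set_index("ds").asfreq("D")

# Fill the gap (forward fill here; interpolation is another option).
full["y"] = full["y"].ffill()

# Add a small constant so the series never contains exact zeros.
full["y"] = full["y"] + 0.001

print(full)
```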
What are the specs of your machine?
@Bradley de Leeuw, I don’t know for sure, but maybe you can try turning off parallelism and seeing if it helps? There is a default parameter `n_jobs = -1`. Try setting it to 1 and see if it helps stability. If it does, I might have other ideas.
Can you show me the traceback of the compilation issue?
m
On the MacBook I have 32 GB of RAM. I tried tuning the parameters without luck. What is the recommended approach when there are too many combinations?
k
32 GB is a lot. That should be fine. You’re not using the default parameters, right?
m
This is the last message. In this case I'm running the example from the documentation, which should run in seconds on the MacBook, but it never finishes. The other model I'm running on my other PC with 12 GB of RAM (the one with more combinations).
k
Are you familiar with numba? I think if it’s like this, we might just want to disable it.
Based on this issue, there isn’t much we can do here. The best idea I have is to try an environment variable:

```
NUMBA_DISABLE_JIT=1
```
Or before you import `statsforecast`:
```python
import os
os.environ['NUMBA_DISABLE_JIT'] = '1'
```
Not really sure if it will work, but worth a shot
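One detail worth noting about the snippet above: the variable has to be set before numba is first imported (statsforecast imports it under the hood), because numba reads `NUMBA_DISABLE_JIT` when it initializes. A small sketch that also checks whether the flag took effect, assuming numba is installed:

```python
import os

# Set the flag BEFORE anything imports numba, or it has no effect.
os.environ["NUMBA_DISABLE_JIT"] = "1"

# Optional sanity check: numba mirrors the env var in its config.
try:
    import numba
    print("JIT disabled:", bool(numba.config.DISABLE_JIT))
except ImportError:
    print("numba not installed")
```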
m
I'll give it a shot, thanks Kevin!
It's difficult to know if it's running, but the message changed. Is there any way to log the iterations or something like that?
k
maybe `verbose=True` in the `StatsForecast`?
m
Let me see, I'm trying it right now, thanks