# timegpt
x
Good afternoon, I've noticed that the inference time varies from 10 seconds to 20 seconds with exactly the same payload. What could be causing this?
j
We run on serverless infra, and a cold start adds roughly 10-15s, so the requests that take longer are most likely hitting a cold start
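If you want to confirm it on your side, here's a minimal sketch (standard library only; `send_request` is a hypothetical stand-in for whatever call you already make with that payload) that times repeated identical requests, so a cold start shows up as a slow outlier:

```python
import time
import statistics
from typing import Any, Callable


def measure_latency(send_request: Callable[[], Any], n: int = 10) -> list[float]:
    """Issue the same request n times and record wall-clock latency.

    A cold start appears as a slow outlier (~20s in your case), while warm
    requests cluster around the lower latency (~10s).
    """
    latencies = []
    for i in range(n):
        start = time.perf_counter()
        send_request()  # your actual TimeGPT call with the identical payload
        latencies.append(time.perf_counter() - start)
        print(f"request {i + 1}: {latencies[-1]:.1f}s")
    print(f"median {statistics.median(latencies):.1f}s, max {max(latencies):.1f}s")
    return latencies


# Hypothetical usage, assuming `client` is whatever client/HTTP call you use:
# measure_latency(lambda: client.forecast(df=df, h=24))
```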
x
I see, that makes sense, thanks