# mlforecast
m
Hey, I am creating a custom lag transform using Numba. Currently I am experiencing longer execution times. I want to time how long my custom lag transform takes to run on my data during training. Do you have any idea how I should do this?
j
maybe this is too simple, but can't you just use some hard-coded params and then the only thing different is your custom lag transform? or you build a timer around the lag transform, but then i guess you would need a sum over all the times the function was called?
m
I have tried similar approaches, but Numba cannot use the `time` library in `nopython` mode. If I were to use `object` mode, that would increase the execution time due to interpretation by Python, and then what I am trying to measure would be invalid.
j
can you just run preprocess outside tuning / training and put a timer around mlf.preprocess(custom_lag_transform) so you can see how long it takes just to calculate it?
a lot of "just" in the sentence above 🙂
m
Oh, my bad for the misunderstanding. I am keeping that as my last resort since it would mean writing new code. I am hoping it's possible to modify the custom lag transform itself instead, to avoid breaking changes.
j
mhh, then i dunno 🙂 too bad numba doesn't accept `time`
m
Okay, thanks a lot for discussing with me and giving ideas 👍