# mlforecast
The `lags` parameter requires a list of ints, so if passing the same ints from the PACF to mlforecast explicitly (rather than via the variable) works, there's probably something weird with the variable itself. Maybe it's a list of lists or something? As for the transformation tuning, you absolutely want to tune for that; you just need to split your data before you hand it off to the transform to prevent leakage. Not 100% sure whether LightGBMCV will protect you from that, and it's a good question for @José Morales. The Fourier order `k` should probably also be tuned for. It seems like you have weekly data, so 10 is probably OK to start out with; I wouldn't go lower than 5 or higher than 15. Also, I personally have better luck not over-differencing, so maybe stick to just `Differences([1])`, but of course I would tune for it if possible!
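As a rough sketch of the Fourier-order point above: for weekly data with yearly seasonality, order `k` gives `2*k` sin/cos columns over a period of roughly 52 weeks. The function name, the period of 52, and the 4-year (208-week) index here are illustrative assumptions, not anything from mlforecast itself.

```python
import numpy as np

def fourier_terms(t, period, k):
    """Build sin/cos pairs up to order k (2*k columns) as seasonal Fourier features."""
    t = np.asarray(t, dtype=float)
    cols = []
    for i in range(1, k + 1):
        cols.append(np.sin(2 * np.pi * i * t / period))
        cols.append(np.cos(2 * np.pi * i * t / period))
    return np.column_stack(cols)

# Weekly data, yearly seasonality: period ~52; order k = 10 gives 20 feature columns.
X = fourier_terms(np.arange(208), period=52, k=10)
print(X.shape)  # (208, 20)
```

Tuning `k` in the suggested 5–15 range then just means regenerating these columns per trial.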
@Tyler Blume Yeah, I'm going to dig into the way I'm creating that list. It goes from a numpy array to a list, and I think that's where the behavior is getting weird. (Found the fix! Had to write a list comprehension ensuring the lags were all plain ints 🫠) Thank you for the feedback on tuning with the transformations; my gut instinct was to include them in the tuning, but I wasn't sure due to the articles I was reading, so thank you so much for confirming that. (Going to research some useful lag transformations for weekly data.) I split my data before even defining my objective, so I should be good against leakage. Going to tune with them in now and check out the results. For tuning different differencing like you said, I thought of using optuna's suggest_categorical() and defining a dictionary for Optuna to pull from. Also, thank you so much for the ideas on the Fourier features; definitely going to test them out! My data is weekly with about 4 years' worth of timestamps.
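For anyone hitting the same issue: the fix described above boils down to coercing numpy integer types to plain Python ints before handing the list to mlforecast's `lags` argument. The specific lag values below are made up for illustration; the pattern is the point.

```python
import numpy as np

# Hypothetical output of a PACF-based lag selection: the values come back
# as numpy integer types rather than plain Python ints.
selected = np.array([1, 2, 4, 52])

# The fix from the thread: a list comprehension coercing every lag to int,
# so the resulting list is safe to pass as `lags` to mlforecast.
lags = [int(lag) for lag in selected]

print(lags)  # [1, 2, 4, 52]
```

`selected.tolist()` would also produce plain ints here; the comprehension just makes the coercion explicit.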