# timegpt
Stephen Cox:
Hi! I'm getting `KeyError: 'TimeGPT-hi-20'` and have no idea what this means. Hopefully someone can help point me in the direction of how to solve it?
Mariana Menchero:
Hi @Stephen Cox, can you give us more information on what generated this error? If you could share the code you used, we could help you debug.
Stephen Cox:
Thanks @Mariana Menchero. I did manage to resolve that error, but now I'm getting a similar one: `KeyError: 'TimeGPT-lo-20'`. I don't get the error when I comment out the quantiles line:
fy25_pax_preds_df = timegpt.forecast(
    df=historical_pax_df,
    X_df=future_pax_exogenous_df,
    h=15,
    finetune_steps=15,
    finetune_loss='smape',
    time_col='timestamp',
    target_col='value',
    quantiles=[0.4, 0.6],
    freq='MS',
)
Here's the full error:
INFO:nixtlats.timegpt:Validating inputs...
INFO:nixtlats.timegpt:Preprocessing dataframes...
WARNING:nixtlats.timegpt:The specified horizon "h" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.
INFO:nixtlats.timegpt:Using the following exogenous variables: trading
INFO:nixtlats.timegpt:Calling Forecast Endpoint...
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
~\anaconda3\lib\site-packages\pandas\core\indexes\base.py in get_loc(self, key, method, tolerance)
   3628             try:
-> 3629                 return self._engine.get_loc(casted_key)
   3630             except KeyError as err:

~\anaconda3\lib\site-packages\pandas\_libs\index.pyx in pandas._libs.index.IndexEngine.get_loc()

~\anaconda3\lib\site-packages\pandas\_libs\index.pyx in pandas._libs.index.IndexEngine.get_loc()

pandas\_libs\hashtable_class_helper.pxi in pandas._libs.hashtable.PyObjectHashTable.get_item()

pandas\_libs\hashtable_class_helper.pxi in pandas._libs.hashtable.PyObjectHashTable.get_item()

KeyError: 'TimeGPT-lo-20'

The above exception was the direct cause of the following exception:

KeyError                                  Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_17372\1034047663.py in <module>
      1 list = [0.4, 0.6]
      2 
----> 3 fy25_pax_preds_df = timegpt.forecast(
      4     df=historical_pax_df,
      5     X_df=future_pax_exogenous_df,

~\anaconda3\lib\site-packages\nixtlats\timegpt.py in forecast(self, df, h, freq, id_col, time_col, target_col, X_df, level, quantiles, finetune_steps, finetune_loss, clean_ex_first, validate_token, add_history, date_features, date_features_to_one_hot, model, num_partitions)
   1135         """
   1136         if isinstance(df, pd.DataFrame):
-> 1137             return self._forecast(
   1138                 df=df,
   1139                 h=h,

~\anaconda3\lib\site-packages\nixtlats\timegpt.py in wrapper(self, *args, **kwargs)
    663                 f'supported models: {", ".join(self.supported_models)}'
    664             )
--> 665         return func(self, *args, **kwargs)
    666 
    667     return wrapper

~\anaconda3\lib\site-packages\nixtlats\timegpt.py in wrapper(self, num_partitions, **kwargs)
    681     def wrapper(self, num_partitions, **kwargs):
    682         if num_partitions is None or num_partitions == 1:
--> 683             return func(self, **kwargs, num_partitions=1)
    684         df = kwargs.pop("df")
    685         X_df = kwargs.pop("X_df", None)

~\anaconda3\lib\site-packages\nixtlats\timegpt.py in _forecast(self, df, h, freq, id_col, time_col, target_col, X_df, level, quantiles, finetune_steps, finetune_loss, clean_ex_first, validate_token, add_history, date_features, date_features_to_one_hot, model, num_partitions)
    821             max_wait_time=self.max_wait_time,
    822         )
--> 823         fcst_df = timegpt_model.forecast(df=df, X_df=X_df, add_history=add_history)
    824         self.weights_x = timegpt_model.weights_x
    825         return fcst_df

~\anaconda3\lib\site-packages\nixtlats\timegpt.py in forecast(self, df, X_df, add_history)
    554             fitted_df = fitted_df.drop(columns="y")
    555             fcst_df = pd.concat([fitted_df, fcst_df]).sort_values(["unique_id", "ds"])
--> 556         fcst_df = self.transform_outputs(fcst_df, level_to_quantiles=True)
    557         return fcst_df
    558 

~\anaconda3\lib\site-packages\nixtlats\timegpt.py in transform_outputs(self, fcst_df, level_to_quantiles)
    246                     col = f"TimeGPT-{hi_or_lo}-{lv}"
    247                 q_col = f"TimeGPT-q-{int(q * 100)}"
--> 248                 fcst_df[q_col] = fcst_df[col].values
    249                 cols.append(q_col)
    250             fcst_df = fcst_df[cols]

~\anaconda3\lib\site-packages\pandas\core\frame.py in __getitem__(self, key)
   3503             if self.columns.nlevels > 1:
   3504                 return self._getitem_multilevel(key)
-> 3505             indexer = self.columns.get_loc(key)
   3506             if is_integer(indexer):
   3507                 indexer = [indexer]

~\anaconda3\lib\site-packages\pandas\core\indexes\base.py in get_loc(self, key, method, tolerance)
   3629                 return self._engine.get_loc(casted_key)
   3630             except KeyError as err:
-> 3631                 raise KeyError(key) from err
   3632             except TypeError:
   3633                 # If we have a listlike key, _check_indexing_error will raise

KeyError: 'TimeGPT-lo-20'
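For context on the column name in the error: the `transform_outputs` frame in the traceback shows `nixtlats` looking up a prediction-interval column for each requested quantile and renaming it. A rough, illustrative reconstruction of that mapping, assumed from the frames above (the library's exact conversion may differ):

```python
# Illustrative reconstruction of the quantile-to-column mapping, assumed from
# the transform_outputs frame in the traceback; the library's internals may differ.
quantiles = [0.4, 0.6]

for q in quantiles:
    # A quantile q is one side of the two-sided interval whose level is |200*q - 100|,
    # so 0.4 and 0.6 both correspond to the level-20 interval.
    lv = int(round(abs(200 * q - 100)))
    hi_or_lo = 'lo' if q < 0.5 else 'hi'
    col = f'TimeGPT-{hi_or_lo}-{lv}'      # interval column expected in the response
    q_col = f'TimeGPT-q-{int(q * 100)}'   # quantile column returned to the caller
    print(f'quantile {q}: reads {col!r}, writes {q_col!r}')
```

So quantiles 0.4 and 0.6 both resolve to the level-20 interval columns, and the `KeyError: 'TimeGPT-lo-20'` indicates that column was missing from the response for this dataset, which is consistent with the error disappearing when `quantiles` is commented out.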
Mariana Menchero:
Hi @Stephen Cox, it's weird that you're using quantiles but the error seems to be related to the prediction intervals. Can you please make sure you have the latest version of `nixtlats`?
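A quick way to confirm which `nixtlats` release is installed, using only the standard library (upgrade with `pip install -U nixtlats` if it lags behind the latest release):

```python
from importlib.metadata import version

# Print the installed nixtlats release to confirm the upgrade took effect.
print(version("nixtlats"))
```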
Stephen Cox:
That's what I figured from the error. I wasn't on the latest version, but I've updated and am still seeing the same issue.
I've even done a fresh install of Python, Jupyter, and nixtlats on a different laptop and got the same error. Are you able to reproduce it?
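For anyone trying to reproduce this, a minimal, self-contained sketch of the call above with synthetic monthly data might look roughly like the following. It mirrors the arguments in the original snippet (single series, one exogenous column named `trading` as in the log, placeholder token); it is an illustration of the setup, not a confirmed reproducer of the error:

```python
import numpy as np
import pandas as pd
from nixtlats import TimeGPT

# Placeholder token; replace with a real one.
timegpt = TimeGPT(token='YOUR_TOKEN')

# Synthetic monthly history with one exogenous variable ('trading', as in the log).
hist_dates = pd.date_range('2018-01-01', periods=72, freq='MS')
historical_pax_df = pd.DataFrame({
    'timestamp': hist_dates,
    'value': np.random.default_rng(0).normal(100, 10, size=len(hist_dates)),
    'trading': np.tile([20, 21, 22], len(hist_dates) // 3),
})

# Future values of the exogenous variable covering the 15-step horizon.
future_dates = pd.date_range(hist_dates[-1] + pd.offsets.MonthBegin(1), periods=15, freq='MS')
future_pax_exogenous_df = pd.DataFrame({
    'timestamp': future_dates,
    'trading': np.tile([20, 21, 22], 5),
})

fy25_pax_preds_df = timegpt.forecast(
    df=historical_pax_df,
    X_df=future_pax_exogenous_df,
    h=15,
    finetune_steps=15,
    finetune_loss='smape',
    time_col='timestamp',
    target_col='value',
    quantiles=[0.4, 0.6],
    freq='MS',
)
```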
Mariana Menchero:
I see. Let me keep thinking about your error, because I'm not sure where it is coming from.
And no, I haven't been able to reproduce your error yet. I tried using a dataset from our docs and didn't have any issues with the quantiles.
Stephen Cox:
Thanks, I'll try a different dataset. I've used quantiles without a problem before.
Mariana Menchero:
Please let me know if the error persists with other datasets. If not, then we should look more closely at the dataset you're currently using.
Stephen Cox:
I used a subset of the data that generated the error and it worked without a problem 🤔
It must be an issue caused by something within my dataset.
💡 1
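Since a subset of the data works, a few quick pandas checks can help narrow down what in the full dataset trips the quantile path. This is a generic sketch of checks that commonly explain data-dependent failures (column names match the call above), not a diagnosis of this specific error:

```python
import pandas as pd

df = historical_pax_df.copy()
df['timestamp'] = pd.to_datetime(df['timestamp'])

# Missing targets and target dtype.
print('NaN targets:', df['value'].isna().sum())
print('target dtype:', df['value'].dtype)

# Duplicate timestamps.
print('duplicate timestamps:', df['timestamp'].duplicated().sum())

# Gaps in the monthly grid between the first and last observation.
full_range = pd.date_range(df['timestamp'].min(), df['timestamp'].max(), freq='MS')
print('missing monthly periods:', len(full_range.difference(df['timestamp'])))

# Overall history length available for fine-tuning plus the 15-step horizon.
print('observations:', len(df))
```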
Mariana Menchero:
Hey @Stephen Cox, if you want, you can share your email with us so that we can help you debug. It seems the error is in your data, and we understand that might be proprietary.
Stephen Cox:
Thanks @Mariana Menchero, I can be reached at stephen.cox@magicmemories.com
👍 1