Chris Gervais

07/04/2023, 9:24 PM
it looks promising as a foundation model for transfer learning

Cristian (Nixtla)

07/05/2023, 12:40 PM
Thanks for the insight! We presented our work on transferability of different architectures at the International Symposium on Forecasting. We found that most architectures can work reasonably well (nhits, tft, patchtst, lstm, tcn), outperforming autoarimas and other baselines trained directly on the target dataset.
It would be interesting to also include TimesNet in the comparison

Chris Gervais

07/05/2023, 1:00 PM
very cool, any chance there’s a recording of that? also curious whether this was done with global models

Pascal Schindler

07/05/2023, 5:50 PM
@Cristian (Nixtla) Would be interested in your insights 🙏