# support
Tracy
A question from a user with a few components. @Yibei could you please help with an answer? Thank you so much!
Hello, I have three questions I'd like advice on:
1. During fine-tuning, which aspects of the model are updated when I adjust the loss function and the number of iterations to find the optimal model?
2. When predicting stock prices, I built the sliding window in two ways, one rolling and one expanding. Why do the final prediction results of the two methods turn out to be exactly the same?
3. How do you control randomness? Why does the same input yield exactly the same prediction results twice?
Yibei
Hi Tracy! I drafted an answer:
1. During fine-tuning, the model weights are adjusted to better fit your specific dataset, so the model learns the nuances of your data. The loss function and the number of fine-tuning iterations control how that adjustment is made.
2. In general, the same input produces the same output, so identical forecasts usually mean the model effectively saw identical inputs. Since TimeGPT has a limit on input data size, earlier data may get truncated in both the expanding and rolling window setups; if both windows are truncated down to the same recent history, their predictions will match.
3. The same input will consistently yield the same output. This behavior is related to aspects of the model structure, which we cannot disclose.
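To illustrate point 2, here is a minimal sketch in plain pandas (not Nixtla's actual preprocessing). It assumes a hypothetical maximum input length `MAX_INPUT`; the real limit depends on the TimeGPT model being called:
```python
import numpy as np
import pandas as pd

# Hypothetical maximum number of past observations the model accepts.
# The real TimeGPT limit may differ; this value is only for illustration.
MAX_INPUT = 512

# Toy price series, deliberately longer than the assumed input limit.
rng = np.random.default_rng(0)
prices = pd.Series(
    100 + rng.normal(0, 1, size=2000).cumsum(),
    index=pd.date_range("2015-01-01", periods=2000, freq="D"),
)

anchor = 1500  # forecast origin: we predict from this point onward

# Expanding window: all history up to the forecast origin.
expanding = prices.iloc[:anchor]

# Rolling window: only the most recent `window` observations.
window = 1000
rolling = prices.iloc[anchor - window:anchor]

# If the model truncates each input to its last MAX_INPUT points,
# both histories collapse to the same tail -> identical forecasts.
expanding_seen = expanding.iloc[-MAX_INPUT:]
rolling_seen = rolling.iloc[-MAX_INPUT:]
print(expanding_seen.equals(rolling_seen))  # True
```
So if you want the two setups to actually produce different forecasts, the rolling window needs to be shorter than whatever effective input limit applies, otherwise both approaches end up feeding the model the same tail of the series.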
Tracy
thank you so much!