Hey Everyone,
I have a question about implementing sample weighting in the loss functions of NeuralForecast (or rather, its neural models). Let me clarify what I mean by this:
1. Introducing a weight for each observation based on the unique_id it comes from;
2. Applying that weight in the loss calculation, so that losses for a specific unique_id are taken into account more strongly than losses for data points coming from other unique_ids (see the small numeric example below).
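To make point 2 concrete, here is a tiny sketch in plain PyTorch of the effect I'm after; the series names, weights, and error values are made up purely for illustration:

```python
import torch

# Hypothetical per-series weights: windows from "store_A" should count
# twice as much as windows from "store_B" in the training loss.
weights_per_id = {"store_A": 2.0, "store_B": 1.0}

abs_err = torch.tensor([[2.0, 2.0],   # window from store_A
                        [4.0, 4.0]])  # window from store_B
w = torch.tensor([[weights_per_id["store_A"]],
                  [weights_per_id["store_B"]]])

unweighted = abs_err.mean()                                  # 3.0
weighted = (w * abs_err).sum() / w.expand_as(abs_err).sum()  # ~2.67
# The weighted loss is pulled toward store_A's errors, so training
# would put more effort into fitting store_A closely.
print(unweighted.item(), weighted.item())
```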
Why?
When training a "global" model, i.e. one trained on multiple different time series, I want to be able to put more emphasis on a certain target unique_id during training. I already implemented oversampling, which should be roughly equivalent, but introducing a sample weight would give finer control.
What I have already tried / am aware of
I would need to introduce a custom loss via the BasePointLoss class, but the main issue is that I don't see how to pass a weight tensor into that loss: the model's fit method doesn't accept additional arguments (and passing one would probably be hacky anyway, given how the input is transformed into batches internally). I also thought about (ab)using the mask argument, but since it is set internally in the BaseModel class, I don't see how I could hook into it either.
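For concreteness, this is roughly the loss I would like to end up with. I'm writing it as a plain torch.nn.Module rather than a BasePointLoss subclass to keep it self-contained; the sample_weight argument is hypothetical and is exactly the tensor I don't see how to get the model to pass alongside y, y_hat, and mask:

```python
import torch


class SampleWeightedMAE(torch.nn.Module):
    """MAE with one weight per window, derived from the window's unique_id.

    `sample_weight` is a hypothetical argument: nothing in my current setup
    supplies it, which is the core of my question.
    """

    def forward(self, y, y_hat, mask=None, sample_weight=None):
        losses = torch.abs(y - y_hat)
        # Start from all-ones weights, then fold in the mask and the
        # (hypothetical) per-window sample weights.
        weights = torch.ones_like(losses)
        if mask is not None:
            weights = weights * mask
        if sample_weight is not None:
            # one weight per window in the batch, broadcast across the horizon
            weights = weights * sample_weight.reshape(-1, 1).to(losses.dtype)
        return (weights * losses).sum() / weights.sum().clamp_min(1e-8)
```

Defining the loss itself is not the problem; the open question is how the per-window sample_weight (or any mapping from unique_id to weight) could reach the loss call during training.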
I'm also aware that sample weighting exists for MLForecast models, but I would like to enable this for NeuralForecast models in our use case.
Does anyone have an idea how to enable this (without having to modify the source code)? Thank you very much in advance, and please tell me if anything is unclear.