I’m trying to perform a classification task on time series data. I want to first predict the continuation of the time series and then classify every predicted time step. I’m using a default PyTorch transformer to generate the continuation, and I suspect the transformer has too much freedom in its intermediate predictions if only the final classification is used for the loss.
Example Input: (1,2,3,4,5)
Example Continuation: (6,7,8)
Example Classification: (True, False, True) # simple example: just classify all even numbers as true
Can I somehow create two different losses and combine them? I’m using BCEWithLogitsLoss for the classification. Which loss should I use for the transformer part, and how do I combine the two?
If you define two loss functions in your script, you can just compute the losses individually and add them together. Autograd will backprop through both losses without a problem.
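For example, here is a minimal sketch with dummy tensors standing in for your model’s two outputs (MSELoss for the continuation is just one common choice here, not necessarily what you have to use):

    import torch
    import torch.nn as nn

    # Dummy stand-ins for the model outputs: batch of 8, continuation length 3.
    pred_continuation = torch.randn(8, 3, requires_grad=True)  # regression head output
    class_logits = pred_continuation * 2.0                     # classification head logits
    target_continuation = torch.randn(8, 3)
    target_labels = torch.randint(0, 2, (8, 3)).float()

    mse = nn.MSELoss()               # one common choice for the regression part
    bce = nn.BCEWithLogitsLoss()     # the classification loss you already use

    loss_regression = mse(pred_continuation, target_continuation)
    loss_classification = bce(class_logits, target_labels)

    # Adding the two scalars builds a single graph; backward() flows through both.
    total_loss = loss_regression + loss_classification
    total_loss.backward()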
So literally just total_loss = loss1 + loss2? Nice, thank you.
Any recommendation for which type of loss function works well for the transformer/time-series part?
I see, thank you. Would I usually have to reduce the learning rate when adding another loss term? I already somewhat tuned my hyperparameters and I’m wondering whether I need to tune them again.
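One common alternative to retuning the learning rate is to weight the two terms so the total loss stays roughly at the scale you already tuned for; a minimal sketch, reusing the names from the snippet above (alpha is a hypothetical hyperparameter):

    # Rebalance instead of retuning: scale one term so the combined loss
    # magnitude stays comparable (alpha is a hypothetical weight to tune).
    alpha = 0.5
    total_loss = loss_regression + alpha * loss_classification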