I have the following problem: I have implemented a 7-layer neural network. My loss function has two parts. The first part is the cross-entropy for the classification problem, which is easy to implement and to backpropagate.
However, the second part of my loss depends on the output of the 6th layer, i.e. an intermediate feed-forward result.
May I ask how to compute the second part of the loss, and how to combine it with the first part for backpropagation?
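A minimal sketch of one common way to do this in PyTorch: have `forward` return the intermediate 6th-layer activation alongside the final logits, build both loss terms, and let autograd backpropagate through the sum. The architecture, dimensions, and the second loss below are hypothetical placeholders, since the post doesn't specify them:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Hypothetical 7-layer MLP; the real architecture is not given in the post."""
    def __init__(self, dim=16, num_classes=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(6)] + [nn.Linear(dim, num_classes)]
        )

    def forward(self, x):
        for layer in self.layers[:6]:
            x = torch.relu(layer(x))
        h6 = x                        # 6th-layer (intermediate) activation
        logits = self.layers[6](h6)
        return logits, h6             # expose both the final and intermediate outputs

x = torch.randn(8, 16)
y = torch.randint(0, 4, (8,))
model = Net()
logits, h6 = model(x)

loss1 = nn.functional.cross_entropy(logits, y)  # classification term
loss2 = h6.pow(2).mean()                        # placeholder for the 6th-layer term
loss = loss1 + loss2
loss.backward()                                 # autograd handles both paths
```

Because `h6` is part of the same computation graph, `loss.backward()` accumulates gradients from both terms into the shared layers automatically; no manual backward pass is needed.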
Hi, I tried `loss = loss1 + loss2` followed by `loss.backward()`. However, loss1 doesn’t decrease during training while loss2 does. Could you tell me how to solve this problem?
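One thing worth checking in this situation (my suggestion, not something stated in the thread) is whether one term's gradient dominates the other, in which case a weighting factor on the larger term can help. A toy sketch with stand-in losses, using `torch.autograd.grad` to inspect each term's gradient separately:

```python
import torch

torch.manual_seed(0)
w = torch.randn(5, requires_grad=True)

loss1 = (w - 1.0).pow(2).mean()    # stand-in for the cross-entropy term
loss2 = 100.0 * w.pow(2).mean()    # stand-in for the 6th-layer term, at a larger scale

# Inspect each term's gradient on its own to see which one dominates.
# retain_graph=True keeps both graphs alive for the combined backward below.
g1 = torch.autograd.grad(loss1, w, retain_graph=True)[0]
g2 = torch.autograd.grad(loss2, w, retain_graph=True)[0]

lam = 0.1                          # hypothetical weight; tune so neither term dominates
loss = loss1 + lam * loss2
loss.backward()                    # gradients from both (reweighted) terms accumulate in w.grad
```

If `g2.norm()` is much larger than `g1.norm()`, the optimizer mostly follows loss2, which would match the symptom described (loss2 drops, loss1 stalls); shrinking `lam` rebalances the two terms.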