Two Outputs RNN - Backprop Two Different Loss

Hello there,

First of all, since I am quite new to PyTorch, bear with me!

I have an LSTM architecture that outputs the last 2 timesteps. I compute a different loss for each timestep, loss1 and loss2, through the same module. I want to backpropagate loss1 from timestep1 and loss2 from timestep2.

How do I achieve this? From the backward function of my LSTM module or from the backward function of the Loss module?

Thanks a lot!

Xem’

I assume you would like to calculate the gradients for a few time steps.
If that’s the case, you could probably call:

loss1 = ...
loss2 = ...
loss1.backward(retain_graph=True)  # doesn't clear the intermediates
loss2.backward()

Note that this procedure will accumulate the gradients.
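To make this concrete, here is a minimal self-contained sketch of the pattern described above. The model, shapes, and targets are all hypothetical stand-ins for your setup; the point is just that the two losses come from two different timesteps, share the same output module, and are backpropagated one after the other:

```python
import torch
import torch.nn as nn

# Hypothetical setup: an LSTM whose last two timesteps each get their own loss.
torch.manual_seed(0)
lstm = nn.LSTM(input_size=4, hidden_size=8, batch_first=True)
head = nn.Linear(8, 1)           # shared output module for both timesteps
criterion = nn.MSELoss()

x = torch.randn(2, 5, 4)         # (batch, seq_len, features)
out, _ = lstm(x)                 # out: (batch, seq_len, hidden)

pred1 = head(out[:, -2, :])      # second-to-last timestep
pred2 = head(out[:, -1, :])      # last timestep
target1 = torch.randn(2, 1)      # dummy targets
target2 = torch.randn(2, 1)

loss1 = criterion(pred1, target1)
loss2 = criterion(pred2, target2)

loss1.backward(retain_graph=True)  # keep intermediates for the second backward
loss2.backward()                   # gradients accumulate into .grad
```

Since the gradients accumulate, this is mathematically equivalent to calling `(loss1 + loss2).backward()` once, which also avoids the need for `retain_graph=True`.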