One batch, many backpropagations

If I have a batch of 100, can I run backpropagation for each of the 100 samples individually? That is, one forward pass, then 100 backward passes?

You can. PyTorch typically computes the element-wise loss, averages it, and then backpropagates.

When defining the loss, just choose the option that skips the reduction (reduction='none'), and the loss will return 100 elements instead of a single number.

Then just backprop each element with element.backward(). You will have to set retain_graph=True, since otherwise the graph is freed after the first backward call:

    for element in loss:
        element.backward(retain_graph=True)
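
For instance, a minimal end-to-end sketch (the linear model and random data here are made up for illustration):

    import torch
    import torch.nn as nn

    # Hypothetical toy setup: a tiny model on a batch of 100 samples.
    model = nn.Linear(10, 1)
    inputs = torch.randn(100, 10)
    targets = torch.randn(100, 1)

    criterion = nn.MSELoss(reduction='none')           # one loss value per element
    loss = criterion(model(inputs), targets).view(-1)  # shape: (100,)

    # Backprop each sample's loss separately. retain_graph=True keeps the
    # intermediate activations alive for the next backward pass.
    for element in loss:
        element.backward(retain_graph=True)

Keep in mind that gradients accumulate in each param.grad across backward calls; if you want each sample's gradient on its own, snapshot and zero the parameter gradients between iterations.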


If I use MSELoss, which is the correct option?

reduction='none' does not seem to work with my batch.
That leaves only 'sum' and 'mean', but they will not give the desired result. Most likely I will have to write my own loss function.

It's not necessary to use a for-loop:

    mse_criterion = torch.nn.MSELoss(reduction='none')
    loss = mse_criterion(input, label)  # input -> BxCxHxW, loss -> BxCxHxW; reduce the non-batch dims for shape B
    loss.backward(gradient=torch.ones_like(loss))

Then input.grad (if input.requires_grad) will hold the individual grad for each sample. (Note that loss is a non-leaf tensor, so loss.grad is not populated by default.)
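
To make the shapes concrete, here is a sketch under the same names (the conv model and sizes are hypothetical). Note that calling backward with a vector of ones is mathematically equivalent to loss.sum().backward(), so parameter gradients are still summed over the batch; the per-sample separation shows up in the input gradients:

    import torch
    import torch.nn as nn

    model = nn.Conv2d(3, 3, kernel_size=3, padding=1)      # hypothetical model
    input = torch.randn(8, 3, 16, 16, requires_grad=True)  # BxCxHxW
    label = torch.randn(8, 3, 16, 16)

    mse_criterion = torch.nn.MSELoss(reduction='none')
    elementwise = mse_criterion(model(input), label)              # BxCxHxW
    loss = elementwise.view(elementwise.size(0), -1).mean(dim=1)  # shape: B

    # One backward pass covering all samples at once.
    loss.backward(gradient=torch.ones_like(loss))

    print(input.grad.shape)  # torch.Size([8, 3, 16, 16]): per-sample input grads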
