Mini-batch gradient and Loss.backward()

I am confused: does Loss.backward() calculate the gradients separately for each sample in the mini-batch, or just once for the whole batch?
And if I only want to update the weights according to the gradients of a subset of the samples in the batch, what can I do? Thanks.
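As a minimal sketch of both points (the model, data, and `subset_idx` below are hypothetical, just for illustration): `loss.backward()` runs a single backward pass on the scalar loss, so the per-sample contributions are already reduced (summed/averaged) into each parameter's `.grad`. To update from only a subset of the batch, one option is to compute per-sample losses with `reduction='none'`, keep the samples you want, reduce, and then call backward.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model and mini-batch, just for illustration.
model = nn.Linear(10, 1)
x = torch.randn(32, 10)   # mini-batch of 32 samples
y = torch.randn(32, 1)

# Standard case: the mean loss over the batch is a single scalar,
# so loss.backward() runs one backward pass; each parameter's .grad
# then holds the gradient of that averaged loss (per-sample gradients
# are reduced internally, not stored individually).
criterion = nn.MSELoss()                  # default reduction='mean'
loss = criterion(model(x), y)
loss.backward()

# Subset update: per-sample losses, pick the samples you care about,
# reduce to a scalar, then backward.
model.zero_grad()
per_sample = nn.MSELoss(reduction='none')(model(x), y).view(-1)  # shape [32]
subset_idx = torch.tensor([0, 3, 7])      # hypothetical subset of samples
subset_loss = per_sample[subset_idx].mean()
subset_loss.backward()                    # grads now reflect only that subset
```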

Duplicate question; already answered here: Is Loss.backward() function calculate gradients over mini-batch