Mini-batch gradient and Loss.backward()

This is a duplicate question. It is already answered there: "Is Loss.backward() function calculate gradients over mini-batch".
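
For quick reference, a minimal sketch (assuming PyTorch) of the point the linked answer addresses: a single call to `loss.backward()` computes gradients for the entire mini-batch, because the loss itself is already reduced (by default, averaged) over the batch dimension. The model, loss, and batch size below are arbitrary illustrations, not taken from the linked question.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
criterion = nn.MSELoss()                     # default reduction='mean' averages over the batch
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 10)                      # a mini-batch of 32 samples
y = torch.randn(32, 1)

optimizer.zero_grad()                        # clear gradients left over from the previous step
loss = criterion(model(x), y)                # scalar loss averaged over all 32 samples
loss.backward()                              # one call: gradients w.r.t. the whole mini-batch
optimizer.step()
```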