I appreciate your reply.
In this case,
torch.autograd.backward([output], [grad])
is the gradient of each sample in the minibatch accumulated as in the sketch below?
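For concreteness, here is a minimal sketch of what I mean (my own toy example, assuming a simple linear model; the names x, w, output, grad are just placeholders):

```python
import torch

x = torch.randn(4, 3)                      # minibatch of 4 samples
w = torch.randn(3, requires_grad=True)
output = x @ w                             # shape (4,), one value per sample
grad = torch.ones_like(output)             # upstream gradient for each sample

torch.autograd.backward([output], [grad])

# Per-sample gradient by hand: d(output[i])/dw = x[i], i.e. one row per sample.
per_sample = x
# Is w.grad the sum of the per-sample gradients over the minibatch?
print(torch.allclose(w.grad, per_sample.sum(dim=0)))
```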
Thanks