Grad value error

Hi PyTorchers!
I was checking the gradient of the loss w.r.t. the bias of the last conv layer.

Here is my code.

myLoss = (pred - answer).abs().mean()  # 1x1x256x256 → scalar L1 loss
print(myLoss.item())
optimizer.zero_grad()
myLoss.backward(retain_graph=True)
torch.cuda.synchronize()
print(last_layers_bias.grad)

I expected myLoss.item() and last_layers_bias.grad to have the same value.
But the two values are very different, so I guess this is a bug?

Why do you think they should have the same value?

I figured it out; I had misunderstood.
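
For anyone who finds this later, here is a minimal sketch (with hypothetical tensor shapes and a toy conv layer, not my actual model) of why the two values differ. Since the bias shifts every output element by the same amount, the gradient of an L1 loss w.r.t. the last layer's bias works out to mean(sign(pred - answer)), which is unrelated to the loss value itself.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy "last layer": a single-channel conv, so it has one bias value.
last_conv = nn.Conv2d(1, 1, kernel_size=3, padding=1)
x = torch.randn(1, 1, 8, 8)
answer = torch.randn(1, 1, 8, 8)

pred = last_conv(x)
myLoss = (pred - answer).abs().mean()
myLoss.backward()

# The bias gradient equals mean(sign(pred - answer)), not the loss value.
expected = torch.sign(pred - answer).detach().mean()
print(myLoss.item())               # loss value (mean absolute error)
print(last_conv.bias.grad.item())  # bias gradient: matches `expected`
print(expected.item())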