Say I have tensors a and b, which are the results of functions that have gradients, and I compute a tensor c:
c = a + b - b
Then I call c.backward().
Will the gradient for b still be calculated? Or does PyTorch do something symbolically smart under the hood during autograd?
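For concreteness, here is a minimal sketch of what I mean (with a and b simplified to leaf tensors created with requires_grad=True, rather than the outputs of other differentiable functions as in my actual code):

```python
import torch

# Minimal sketch of the setup above. For simplicity, a and b are leaf tensors
# here; in my real code they come out of other functions that have gradients.
a = torch.randn(3, requires_grad=True)
b = torch.randn(3, requires_grad=True)

c = a + b - b

# backward() without arguments needs a scalar output, so reduce c first.
c.sum().backward()

print(a.grad)  # gradient w.r.t. a
print(b.grad)  # gradient w.r.t. b -- this is the one I'm asking about
```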
Thanks!