If terms cancel out, is the gradient calculated for those terms?

Say I have tensors a and b, which are the results of functions that have gradients, and I compute a tensor c:

c = a + b - b

Then, I call

c.backward()

Will the gradient for b still be calculated? Or does PyTorch do something symbolically smart under the hood during autograd?
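A minimal experiment to illustrate what I mean (I would expect b.grad to either be skipped entirely or come out as zeros, since the two occurrences of b should cancel):

```python
import torch

a = torch.ones(3, requires_grad=True)
b = torch.ones(3, requires_grad=True)

# b appears twice with opposite signs, so its contributions should cancel
c = (a + b - b).sum()
c.backward()

print(a.grad)  # gradient of sum(a + b - b) w.r.t. a
print(b.grad)  # is this computed (and zero), or skipped?
```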

Thanks!