Accelerate backward for variables with no gradient

Hi all,

I ran into a problem with the backward pass.

Let's say there are two Variables x and y, each computed by some other functions.

And we have that

z = x * mask + y * (1 - mask),

When mask = 1, we have z = x * 1 + y * 0. When I call z.backward(), I find that y.grad is 0 instead of None, which means that backpropagation still passes through Variable y. This makes the backward pass slow.
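For reference, here is a minimal sketch of what I am seeing (written with the current tensor API; the shapes and values are made up for illustration, and x and y are leaf tensors here so that .grad is populated directly):

```python
import torch

# x and y stand in for the outputs of the earlier computation
x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)
mask = torch.ones(3)  # mask = 1 everywhere

z = x * mask + y * (1 - mask)
z.sum().backward()

print(x.grad)  # tensor([1., 1., 1.])
print(y.grad)  # tensor([0., 0., 0.]) -- zeros, not None: autograd still visited y
```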

Is it possible to prevent the backward pass from going through y in the above case?

Thank you very much!

You may refer to the Autograd mechanics section in the docs. Either detach y:

y = y.detach()

or

y.requires_grad = False
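Note that requires_grad can only be changed on leaf tensors; since y here is the output of another computation, detach() is the option that applies. A quick sketch of the fix (same made-up setup as above):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)
mask = torch.ones(3)

# detach() returns a view of y that is cut off from the graph,
# so autograd never traverses y's branch during backward
z = x * mask + y.detach() * (1 - mask)
z.sum().backward()

print(y.grad)  # None -- backward skipped y entirely
```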