Grad backward problem

Hello, I have a Tensor a and a Tensor b with the same shape. I want to update a with something like a[b < 0] = 1.0. Can this be used inside a layer of a network? I mean, will it be recorded in the computation graph for the backward pass? Thanks so much.


Yes, this can be done in a model.
Be careful though: the gradients of this op w.r.t. b will be 0 everywhere (the comparison b < 0 is non-differentiable), so you won’t be able to train anything through b. But the gradients for a will be exactly what you expect.
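Here is a minimal sketch of what this looks like in practice (the tensor values are made up for illustration). One caveat worth knowing: in-place assignment on a leaf tensor that requires grad raises an error, so the masked write below happens on an intermediate result instead:

```python
import torch

a = torch.tensor([1.0, 2.0, 3.0, 4.0], requires_grad=True)
b = torch.tensor([-1.0, 1.0, -1.0, 1.0])

out = a * 2       # non-leaf tensor; safe to modify in place
out[b < 0] = 1.0  # positions where b < 0 are overwritten by a constant

out.sum().backward()

# Where out was overwritten, a no longer influences the result, so the
# gradient there is 0; elsewhere it is 2 (from the a * 2 op).
print(a.grad)  # tensor([0., 2., 0., 2.])
```

Note that b never receives a gradient here: the mask b < 0 is a boolean tensor, and no gradient flows through the comparison.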