Hi. Suppose that I have three tensors, a, b, and c, and a threshold thresh, and I would like to define the tensor d as follows:
d[a>thresh] = b[a>thresh]
d[a<thresh] = c[a<thresh]
In our case a, b, and c are tensors with requires_grad=True that depend on an input x.
How can I perform this operation so that the gradient of d with respect to x is computed correctly?
Basically, if I have three functions f, g, and h, each taking an input tensor x and outputting another tensor, and I have:
def forward(self, x):
    a = f(x)
    b = g(x)
    c = h(x)
    # Note: creating d with requires_grad=True and then assigning into it
    # in-place raises a RuntimeError (in-place op on a leaf that requires
    # grad); d picks up its grad history from b and c instead.
    d = torch.zeros(self.shape)
    d[a > thresh] = b[a > thresh]
    d[a < thresh] = c[a < thresh]
    return d
I would like d to be differentiable with respect to x and to the parameters of f (for example, f could be a convolution), without implementing my own backward function. How can I do this?
I also thought about

d = (a > thresh).float() * b + (a < thresh).float() * c

but I am not sure the gradient will be computed properly by autograd.
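For what it's worth, here is a minimal self-contained sketch of that masked-product formulation. The functions below (sin, 2*x, -3*x) are hypothetical stand-ins for f, g, and h, chosen only so the expected gradient is easy to check by hand; autograd treats the comparison masks as constants, so gradients flow through b and c (and entries where a == thresh exactly get zero in this formulation):

```python
import torch

thresh = 0.5
x = torch.randn(4, requires_grad=True)

a = torch.sin(x)   # stand-in for f(x)
b = 2.0 * x        # stand-in for g(x)
c = -3.0 * x       # stand-in for h(x)

# The boolean masks are non-differentiable constants to autograd,
# so d's gradient w.r.t. x comes only through b and c.
d = (a > thresh).float() * b + (a < thresh).float() * c

d.sum().backward()
print(x.grad)  # per-element: 2.0 where a > thresh, -3.0 where a < thresh
```

An equivalent and arguably more idiomatic form is torch.where(a > thresh, b, c), which selects elementwise between b and c and is likewise differentiable through both branches.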
Thanks