jack-kjlee
(Kyeong Jun Lee)
1
def forward(self, input):
    # sort along the channel dimension before convolving
    input, _ = input.sort(dim=1, descending=True)
    out = nn.functional.conv2d(input, self.weight, None, self.stride, self.padding, self.dilation, self.groups)
    return out
My concern is whether the reordering of the input in the code above can affect the calculation of the autograd gradients, or whether it is safe.
Thanks in advance!
colesbury
(Sam Gross)
2
The returned values from sort() are differentiable, if that is what you are asking. For

input2, indices = input1.sort(dim=1, descending=True)

the gradient of input1 will be the re-ordered gradient of input2. For example,
import torch

input1 = torch.tensor([1., 3., 2.], requires_grad=True)
input2, _ = input1.sort(descending=True)  # 3, 2, 1
input2.backward(torch.tensor([0.3, -0.2, 0.01]))
print(input1.grad)  # 0.01, 0.3, -0.2
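To connect this back to the original question, here is a minimal sketch (the SortedConv module name and its constructor arguments are my own, written to match the forward() from the question) that verifies the sorted convolution end to end with torch.autograd.gradcheck, which compares autograd's analytical gradients against numerical finite differences:

```python
import torch
import torch.nn as nn


class SortedConv(nn.Module):
    # Hypothetical module reproducing the question's forward():
    # sort channels in descending order, then apply conv2d.
    def __init__(self, in_ch, out_ch, k):
        super().__init__()
        # double precision, as recommended for gradcheck
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k, dtype=torch.double))

    def forward(self, input):
        input, _ = input.sort(dim=1, descending=True)
        return nn.functional.conv2d(input, self.weight)


m = SortedConv(2, 3, 1)
# well-separated input values, so no ties confuse the numerical gradient
x = torch.arange(32, dtype=torch.double).reshape(1, 2, 4, 4).requires_grad_(True)
print(torch.autograd.gradcheck(m, (x,)))  # True if gradients match
```

gradcheck raises an error if the analytical and numerical gradients disagree, so a True result here confirms that sorting the input before the convolution is safe for backpropagation (away from ties, where sort is not differentiable).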