stefdoerr
(Stefan Doerr)
April 24, 2018, 3:11pm
1
Hey, I am trying to sum values in a Tensor according to an index array, so that values with the same index get summed into the same position of the output. Here is an example:
import torch
from torch.autograd import Variable
import numpy as np
data = Variable(torch.from_numpy(np.array([3, 2, 1, 8, 7], dtype=np.float32)), requires_grad=True)
idx = Variable(torch.from_numpy(np.array([0, 0, 1, 1, 2], dtype=np.int64)), requires_grad=False)
So I would like to obtain a Variable with values [5, 9, 7] (that is, 3+2, 1+8, and 7, grouped by idx), which I can backpropagate through.
I tried doing it with
out = Variable(torch.zeros(3), requires_grad=True)
out.scatter_add_(0, idx, data)
but this won’t work, as I am doing an in-place operation on a leaf Variable.
What is the correct way of doing this?
albanD
(Alban D)
April 24, 2018, 3:13pm
2
Hi,
A simple fix is just to clone out before performing the scatter on it.
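In code, with the tensors from the first post, that would be (a minimal sketch):
out = Variable(torch.zeros(3), requires_grad=True).clone()  # clone() makes out a non-leaf
out.scatter_add_(0, idx, data)  # the in-place op is now allowed; out holds [5., 9., 7.]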
stefdoerr
(Stefan Doerr)
April 24, 2018, 3:13pm
3
Ok that worked! But I don’t get the logic behind it.
albanD
(Alban D)
April 24, 2018, 3:16pm
4
A “leaf Variable” is a Variable created directly by the user, one for which you want the gradients.
The problem is that the original value of such a Variable could be needed elsewhere during the backward pass without PyTorch knowing about it, so in-place changes are forbidden: some computed gradients would be wrong.
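To make it concrete, a small sketch reusing idx and data from above:
leaf = Variable(torch.zeros(3), requires_grad=True)
# leaf.scatter_add_(0, idx, data)  # RuntimeError: a leaf Variable that requires grad
#                                  # is being used in an in-place operation
out = leaf.clone()                 # clone() is an op, so out is a non-leaf node in the graph
out.scatter_add_(0, idx, data)     # allowed: autograd records the in-place op on out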
stefdoerr
(Stefan Doerr)
April 24, 2018, 3:22pm
6
Nevermind, this works:
out = Variable(torch.zeros(3), requires_grad=True).clone()
out = out.scatter_add_(0, idx, data)
out = out.scatter_add_(0, idx2, data2)
albanD
(Alban D)
April 24, 2018, 3:25pm
7
Also you can use the non-inplace version:
out3 = out.scatter_add(0, idx, data).scatter_add(0, idx2, data2)
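Here idx2 and data2 are a second index/value pair (they came from the post that was deleted above); with stand-in values, the whole thing runs and backpropagates:
idx2 = Variable(torch.from_numpy(np.array([2, 2, 0, 1, 0], dtype=np.int64)), requires_grad=False)  # stand-in indices
data2 = Variable(torch.from_numpy(np.array([1, 1, 1, 1, 1], dtype=np.float32)), requires_grad=True)  # stand-in values
out = Variable(torch.zeros(3))  # no clone needed, nothing is modified in place
out3 = out.scatter_add(0, idx, data).scatter_add(0, idx2, data2)  # each call returns a new tensor
out3.sum().backward()  # gradients reach both data and data2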
stefdoerr
(Stefan Doerr)
April 24, 2018, 3:28pm
8
Oops, sorry I deleted my previous post and cannot restore it. Thanks in any case!
Hi,
If I want to use only part of the values in out, for example by selecting indices 1, 2, 3 of out with scatter/gather, is it possible to backpropagate through that?
Thank you in advance
albanD
(Alban D)
May 29, 2019, 3:11pm
10
The scatter/gather ops are differentiable, so it will work.
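For example, selecting entries 1, 2, 3 with gather (a sketch with made-up values, using the current tensor API):
out = torch.tensor([5., 9., 7., 4.], requires_grad=True)  # example values
sel = out.gather(0, torch.tensor([1, 2, 3]))  # picks out[1], out[2], out[3]
sel.sum().backward()
print(out.grad)  # tensor([0., 1., 1., 1.]): only the selected entries receive gradient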