Backprop through scatter_add_

Hi, I was wondering whether there are any issues with backprop when using scatter_add_, since it is an in-place operation. Specifically, in the code below:

x = conv(x)
p = x.new_zeros((n,c,h))
p.scatter_add_(2, idx, x)

Will gradients flow correctly to the weights of conv, if the loss is calculated using p?


Yes, autograd handles in-place operations correctly. If an in-place operation would make the gradient computation impossible (for example, by overwriting values that are needed for the backward pass), autograd raises an error instead of silently computing wrong gradients. In your snippet, p is freshly created with new_zeros, so calling scatter_add_ on it is safe, and gradients will flow back through x to the weights of conv.
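A minimal self-contained sketch of the pattern above, using hypothetical shapes and a Conv1d stand-in for conv, that confirms gradients reach the conv weights through the in-place scatter_add_:

```python
import torch

torch.manual_seed(0)
n, c, w_in, h = 2, 3, 5, 4          # hypothetical sizes for illustration
conv = torch.nn.Conv1d(c, c, kernel_size=1)
x_in = torch.randn(n, c, w_in)

x = conv(x_in)                       # (n, c, w_in), requires grad via conv
idx = torch.randint(0, h, (n, c, w_in))  # target bins along dim 2
p = x.new_zeros((n, c, h))
p.scatter_add_(2, idx, x)            # in-place scatter-add into p

loss = p.sum()
loss.backward()

print(conv.weight.grad is not None)  # True: gradients reached conv
```

Note that p starts out not requiring grad, but after the in-place scatter_add_ with a grad-requiring source, autograd tracks it, so the loss computed from p backpropagates as expected.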