Hi,
It does support autograd, but it can only compute gradients with respect to the input tensor, not the indices (the gradient with respect to the indices does not exist).
This code snippet should make it clear:
import torch
from torch.autograd import Variable
inp = Variable(torch.zeros(10), requires_grad=True)
# requires_grad must stay False: scatter does not provide a gradient wrt the indices
indices = Variable(torch.Tensor([2, 5]).long(), requires_grad=False)
# Clone first; otherwise we would modify a leaf Variable in place
inp_clone = inp.clone()
inp_clone.scatter_(0, indices, 1)
inp_clone.sum().backward()
# Entries not touched by the scatter have gradient 1
# Entries overwritten by the scatter have gradient 0, since their original values were discarded
print(inp.grad)
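
For completeness, scatter_ is also differentiable with respect to a source tensor: if you scatter a tensor of values instead of a constant, gradients flow back to those values. Here is a minimal sketch in the same style (the names base, src and out are just for illustration):

import torch
from torch.autograd import Variable

base = Variable(torch.zeros(10), requires_grad=True)
src = Variable(torch.Tensor([3.0, 4.0]), requires_grad=True)
indices = Variable(torch.Tensor([2, 5]).long(), requires_grad=False)

out = base.clone()
out.scatter_(0, indices, src)
out.sum().backward()

# base.grad is 1 everywhere except at the scattered positions, which are 0
print(base.grad)
# src.grad is 1 for each scattered value, since each one appears exactly once in out
print(src.grad)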