Long indexing with further indexing on Tensor gives a copy

I see that when I index a Tensor with a LongTensor and then index the result normally via [:], a copy is created.

For example:

a = torch.Tensor([10, 20, 30, 40, 50])
ind = torch.LongTensor([1,3])

a[ind][:] = 0
print(a)
# prints [10, 20, 30, 40, 50]

a[ind] = 0
print(a)
# prints [10, 0, 30, 0, 50]

Now I am wondering how I can do the same without getting a copy of the Tensor, so that I am able to overwrite the indexed values.
E.g. if I want to set the first two values of an indexed tensor to zero

a[ind][:2] = 0

How would I do this so the actual Tensor is eventually overwritten?

The last statement is the same as:

setitem(getitem(a, ind), slice(0, 2), 0)

The gather happens first (getitem with a LongTensor returns a copy), so the subsequent setitem writes into that copy and cannot compose into an in-place operation on a.

But you can do things like: a[ind[:2]] = 0
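Concretely: slice the index tensor first, so there is only a single advanced-indexing assignment on a itself, which writes into the original storage. A minimal sketch (the values here are just the ones from the question, with a third index added so the slice actually selects a subset):

```python
import torch

a = torch.tensor([10., 20., 30., 40., 50.])
ind = torch.LongTensor([1, 3, 4])

# Slicing `ind` first means the assignment below is one __setitem__
# call on `a` directly, instead of a write into a gathered copy.
a[ind[:2]] = 0

print(a)  # a is now [10, 0, 30, 0, 50]
```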

Thank you very much. That was exactly what I was looking for!