tensor[1] gets a view of the original tensor and tensor[[1]] gets a copy, is this expected?

Hi,
a quick question with a surprising result, I always learn something new :slight_smile:

```python
import torch

w = torch.rand(10)
w[[4, 5]].zero_()  # indexing with a list of indices
print(w)
w[4:5].zero_()     # slicing
print(w)
```
```
initial w:
tensor([0.8315, 0.6297, 0.2248, 0.7077, 0.9125, 0.7869, 0.5703, 0.2823, 0.0974, 0.5943])
after w[[4,5]].zero_() (w is unchanged):
tensor([0.8315, 0.6297, 0.2248, 0.7077, 0.9125, 0.7869, 0.5703, 0.2823, 0.0974, 0.5943])
after w[4:5].zero_() (element 4 is now zero):
tensor([0.8315, 0.6297, 0.2248, 0.7077, 0.0000, 0.7869, 0.5703, 0.2823, 0.0974, 0.5943])
```
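
A minimal way to check where each result lives (my own sketch, not part of the run above) is to compare `data_ptr()`, which tells you whether the result shares storage with `w`:

```python
import torch

w = torch.rand(10)

sliced = w[4:5]    # basic indexing (slice)
fancy = w[[4, 5]]  # indexing with a list of indices

# A view points into w's storage at an offset; a copy gets its own allocation.
print(sliced.data_ptr() == w.data_ptr() + 4 * w.element_size())  # True: view
print(fancy.data_ptr() == w.data_ptr())                          # False: copy
```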

It seems that indexing with a list of indices (advanced indexing) returns a copy, while slicing returns a view into the original tensor. Is this expected?
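
For completeness, here is a sketch of the in-place alternatives I could use instead. Assigning through the advanced index does write into the original tensor (`__setitem__` dispatches to an in-place `index_put_` rather than materializing a copy), and `index_fill_` is an explicit in-place option:

```python
import torch

w = torch.rand(10)
w[[4, 5]] = 0.0  # assignment through advanced indexing modifies w in place
print(w)         # positions 4 and 5 are now zero

v = torch.rand(10)
v.index_fill_(0, torch.tensor([4, 5]), 0.0)  # explicit in-place fill along dim 0
print(v)
```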