b = a[:, 0:2] — I guess the slice of a gets copied into tensor b?
a[:, 0:2] = a[:, 2:4] — but how does this work behind the scenes?
Nope, b and a share memory; b is just a new Python wrapper object (a view) with its own sizes, strides and storage offset over the same storage.
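A minimal sketch (assuming a recent PyTorch) that makes the sharing visible:

```python
import torch

a = torch.arange(12.0).reshape(3, 4)
b = a[:, 0:2]                         # basic slicing returns a view, not a copy

print(a.data_ptr() == b.data_ptr())   # True: both start at the same storage address
print(a.stride(), b.stride())         # view keeps a's strides, only the sizes differ

b[0, 0] = 100.0                       # mutating the view is visible through a
print(a[0, 0])                        # tensor(100.)
```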
For the assignment, Python's object.__setitem__ mechanism is used instead, which makes partial assignment with copying possible.
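Roughly like this (a sketch, assuming standard Python/PyTorch semantics): the statement is sugar for a __setitem__ call on the tensor, which copies the right-hand values into the selected sub-block in place.

```python
import torch

a = torch.arange(12.0).reshape(3, 4)
before_ptr = a.data_ptr()

# sugar for: a.__setitem__((slice(None), slice(0, 2)), a[:, 2:4])
a[:, 0:2] = a[:, 2:4]

print(a)                              # columns 0:2 now hold copies of columns 2:4
print(a.data_ptr() == before_ptr)     # True: a was modified in place, no reallocation
```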
Thanks! And how does a torch tensor implement a[:, 0:2] = a[:, 2:4]?
It should be the same as a[:, 0:2].copy_(a[:, 2:4]), i.e. a strided memory copy.
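You can check the equivalence with a small sketch (assuming PyTorch; note the source and destination columns do not overlap here):

```python
import torch

a = torch.arange(12.0).reshape(3, 4)
b = a.clone()

a[:, 0:2] = a[:, 2:4]          # slice assignment via __setitem__
b[:, 0:2].copy_(b[:, 2:4])     # explicit in-place copy into the view

print(torch.equal(a, b))       # True: both forms write the same values
```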
Indeed, in particular, the RHS is created and then copied over to the slice.
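In other words (a hedged sketch of the evaluation order): the RHS expression is evaluated to a tensor first — here just a view, or a fresh temporary for a compound expression — and that result is what __setitem__ copies into the slice.

```python
import torch

a = torch.arange(12.0).reshape(3, 4)
rhs = a[:, 2:4] * 2            # RHS is evaluated to its own tensor first...
a[:, 0:2] = rhs                # ...then copied into columns 0:2 by __setitem__
print(a)
```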
Thanks! So in a[:, 0:2].copy_(a[:, 2:4]), the a[:, 0:2] is just a reference (a view), not a clone?
That’s correct.