When to use repeat instead of expand?

Hello,
When should we use repeat instead of expand, apart from the case where we want to call an in-place function?

expand does not allocate new memory, which means if you do

import torch

x = torch.tensor([[1], [2], [3]])
expand_x = x.expand(3, 4)
x[0, 0] = 4

then expand_x will be

tensor([[4, 4, 4, 4],
        [2, 2, 2, 2],
        [3, 3, 3, 3]])

If you use repeat_x = x.repeat(1, 4) instead, repeat copies the data into newly allocated memory, so changing x will not affect repeat_x.
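To make the difference concrete, here is a small sketch comparing the two side by side. It assumes nothing beyond the tensors above; the `data_ptr()` check at the end shows that expand returns a view of the same storage while repeat allocates its own.

```python
import torch

x = torch.tensor([[1], [2], [3]])

# expand returns a view: no new memory is allocated,
# so the expanded tensor aliases x's storage
expand_x = x.expand(3, 4)

# repeat copies the data into a freshly allocated tensor
repeat_x = x.repeat(1, 4)

x[0, 0] = 4

print(expand_x[0])  # tensor([4, 4, 4, 4]) -- sees the write to x
print(repeat_x[0])  # tensor([1, 1, 1, 1]) -- unaffected, it owns a copy

# The storage pointers confirm which tensor shares memory with x
print(x.data_ptr() == expand_x.data_ptr())  # True
print(x.data_ptr() == repeat_x.data_ptr())  # False
```

This also explains the in-place caveat from the question: writing through an expanded view would write into the original tensor's storage, whereas a repeated tensor is safe to modify independently.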
