How to expand a tensor without sharing storage?

a is a Variable and I have the following operation:
a_rep = a.expand(5, a.size(0), a.size(1))
for i in range(5):
    a_rep[i] = a_rep[i] * weights[i]

Then I got the following error:
RuntimeError: in-place operations can be only used on variables that don’t share storage with any other variables, but detected that there are 2 objects sharing it

I guess this is because every slice of a_rep shares storage with a, but how can I expand a tensor without sharing storage?
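
For example, comparing data pointers confirms the sharing (using a plain tensor and a made-up shape just for illustration):

import torch

a = torch.randn(3, 4)                      # stand-in for the real tensor
a_rep = a.expand(5, a.size(0), a.size(1))  # a view: no data is copied
print(a.data_ptr() == a_rep.data_ptr())    # True: same underlying storage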

I think Tensor.repeat is what you're looking for here.
expand returns a view that shares storage with the original tensor, while repeat copies the data into fresh storage. You might have to unsqueeze dimension 0 before you use repeat.
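
For instance (a minimal sketch; the shapes and weights here are made up):

import torch

a = torch.randn(3, 4)
weights = torch.randn(5)

# repeat allocates new storage, so the slices no longer alias each other
a_rep = a.unsqueeze(0).repeat(5, 1, 1)  # shape: (5, 3, 4)
for i in range(5):
    a_rep[i] = a_rep[i] * weights[i]    # in-place write now succeeds

With broadcasting you could even skip the loop entirely: a_rep = a.unsqueeze(0) * weights.view(5, 1, 1).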