How to do a repeat operation like numpy.repeat in PyTorch on a Variable tensor?

Assume I have a tensor like the one below:

import torch
from torch.autograd import Variable

var = Variable(torch.randn(1, 7, 7))

I would like to know how I can copy this variable to make a new variable of size 512×7×7, where dimension 0 is the number of copies of my original tensor var. This is a line of code I would like to use in the forward function of a customized new layer. Like other differentiable layers, I would like to benefit from autograd for doing backprop through it, so unpacking this Variable, converting it to a NumPy array, and doing the operation on the NumPy side is not correct according to the philosophy of autograd.

Are there any good approaches for doing the above operation?


http://pytorch.org/docs/search.html?q=expand&check_keywords=yes&area=default

Is this what you are looking for?

You may want to clone() it after that if you actually need to change the 7×7 panes independently.
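To make that concrete, here is a minimal sketch (not from the thread; it uses plain tensors, since Variable has since been merged into Tensor) showing that expand returns a view over the same storage and that clone() gives independent memory:

import torch

# expand() returns a view: all 512 "panes" alias the same 1x7x7 storage,
# so no new memory is allocated.
var = torch.randn(1, 7, 7)
expanded = var.expand(512, 7, 7)
print(expanded.shape)                         # torch.Size([512, 7, 7])
print(expanded.data_ptr() == var.data_ptr())  # True -- shared memory

# Because the elements alias each other, PyTorch refuses in-place writes
# on the expanded view; clone() first to get independent 512x7x7 storage.
independent = expanded.clone()
independent[0].zero_()                        # only pane 0 of the clone changes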


@Veril, thanks for your great response! I appreciate your help.
The solution is:

new_var = var.expand(512, 7, 7)
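For what it's worth, expand is differentiable, so gradients from all 512 copies accumulate back into the single original pane. A quick sanity check (a sketch, not part of the original layer code):

import torch
from torch.autograd import Variable  # merged into Tensor since PyTorch 0.4

var = Variable(torch.randn(1, 7, 7), requires_grad=True)
new_var = var.expand(512, 7, 7)

# Summing the 512 copies and backpropagating: each original element
# appears 512 times, so its gradient is 512.
new_var.sum().backward()
print(var.grad.shape)      # torch.Size([1, 7, 7])
print(var.grad[0, 0, 0])   # tensor(512.)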


You should also look into torch.Tensor.repeat. It's similar and useful in some cases!

When should you use repeat vs expand?

repeat copies the original data and allocates new memory, while expand creates a new view on the existing tensor without copying; note that expand only works on dimensions of size 1.
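A short sketch contrasting the two (data_ptr is used here just to show whether storage is shared):

import torch

x = torch.randn(1, 7, 7)

# expand: a zero-copy view; only singleton dimensions can be expanded,
# and the result aliases the original storage.
v = x.expand(512, 7, 7)
print(v.data_ptr() == x.data_ptr())   # True -- same memory

# repeat: materializes a real copy; works on any dimension, and the
# repeat counts multiply the existing sizes.
r = x.repeat(512, 1, 1)               # copies the 7x7 block 512 times
print(r.shape)                        # torch.Size([512, 7, 7])
print(r.data_ptr() == x.data_ptr())   # False -- new allocation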