Expanding Variable with zero-padding (for DepthConcat)

Hello, I’m new to PyTorch. :slight_smile:
I would like to concatenate two Variables of different spatial sizes, for the DepthConcat module of an Inception network.

For example,

# a: 1x2x2, b: 1x4x4
a = Variable(torch.FloatTensor([[[1,2],[3,4]]]))
b = Variable(torch.FloatTensor([[[1,1,1,1],[2,2,2,2],[3,3,3,3],[4,4,4,4]]]))

What I want to get from something like cat([a, b], 0) is:

(0 ,.,.) =
0 0 0 0
0 1 2 0
0 3 4 0
0 0 0 0

(1 ,.,.) =
1 1 1 1
2 2 2 2
3 3 3 3
4 4 4 4

I could try using narrow() and copy_(), but I’m not sure whether copying into a Variable inside forward() is okay.

Oh, I’ll check whether ConstantPad2d can solve this.
Thank you

Constant padding should do it. There’s no way to copy one Variable into another (if you take out .data and work with that, those ops won’t be registered by autograd).
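
For reference, a minimal sketch of the constant-padding approach, using the a and b Variables from the post above (assuming a PyTorch version where nn.ConstantPad2d accepts a 3D (C, H, W) input):

import torch
import torch.nn as nn
from torch.autograd import Variable

a = Variable(torch.FloatTensor([[[1, 2], [3, 4]]]))            # 1x2x2
b = Variable(torch.FloatTensor([[[1, 1, 1, 1], [2, 2, 2, 2],
                                 [3, 3, 3, 3], [4, 4, 4, 4]]]))  # 1x4x4

pad = nn.ConstantPad2d(1, 0)       # pad the last two dims by 1 on each side with zeros
a_padded = pad(a)                  # 1x2x2 -> 1x4x4

out = torch.cat([a_padded, b], 0)  # 2x4x4, matching the desired output above

Since the padding layer is an ordinary autograd op, it can be used inside forward() without touching .data.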

Thank you!
Constant padding was exactly what I was looking for. I used torch.nn.functional.pad(), which is not in the API documentation. :slight_smile:
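
For anyone finding this later: torch.nn.functional.pad() takes the padding as a flat tuple, (left, right, top, bottom) for the last two dimensions, with mode='constant' and value=0 as defaults. Below is a rough sketch of how it could be wrapped into a DepthConcat-style module for an Inception block; the class name and structure are my own illustration, not from this thread:

import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthConcat(nn.Module):
    # Illustrative sketch: zero-pads each branch output to the largest
    # spatial size, then concatenates along the channel dimension.
    def __init__(self, branches):
        super(DepthConcat, self).__init__()
        self.branches = nn.ModuleList(branches)

    def forward(self, x):
        outputs = [branch(x) for branch in self.branches]  # each (N, C_i, H_i, W_i)
        max_h = max(o.size(2) for o in outputs)
        max_w = max(o.size(3) for o in outputs)
        padded = []
        for o in outputs:
            dh = max_h - o.size(2)
            dw = max_w - o.size(3)
            # (left, right, top, bottom); split any odd remainder across the two sides
            padded.append(F.pad(o, (dw // 2, dw - dw // 2,
                                    dh // 2, dh - dh // 2)))
        return torch.cat(padded, 1)  # concatenate along the channel dim

Splitting the padding between both sides keeps the smaller feature maps roughly centered within the larger spatial size.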