thnkim
(Pete Tae-hoon Kim)
March 11, 2017, 2:14am
1
Hello, I’m new to PyTorch.
I would like to concatenate two Variables of different sizes, for the DepthConcat of an Inception network.
For example,
# a: 1x2x2, b: 1x4x4
a = Variable(torch.FloatTensor([[[1,2],[3,4]]]))
b = Variable(torch.FloatTensor([[[1,1,1,1],[2,2,2,2],[3,3,3,3],[4,4,4,4]]]))
What I want to get from cat([a, b], 0) is
(0 ,.,.) =
0 0 0 0
0 1 2 0
0 3 4 0
0 0 0 0
(1 ,.,.) =
1 1 1 1
2 2 2 2
3 3 3 3
4 4 4 4
I could try narrow() and copy_(), but I’m not sure whether copying into a Variable in forward() is OK.
thnkim
(Pete Tae-hoon Kim)
March 11, 2017, 5:33am
2
Oh, I’ll check whether ConstantPad2d can solve this.
Thank you.
apaszke
(Adam Paszke)
March 11, 2017, 10:28am
3
Constant pad should do it. There’s no way to copy one Variable into another (if you take out .data
and work with that, those ops won’t be registered by autograd).
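As a minimal sketch of this suggestion (written against current PyTorch, where plain tensors carry autograd and the Variable wrapper is no longer needed): zero-pad the smaller tensor with ConstantPad2d so its spatial size matches the larger one, then cat along the channel dimension.

```python
import torch
import torch.nn as nn

# a: 1x2x2, b: 1x4x4 (channels x H x W), as in the question
a = torch.tensor([[[1., 2.],
                   [3., 4.]]])
b = torch.tensor([[[1., 1., 1., 1.],
                   [2., 2., 2., 2.],
                   [3., 3., 3., 3.],
                   [4., 4., 4., 4.]]])

# Zero-pad a by 1 on each side so 2 + 1 + 1 = 4 matches b's spatial size
pad = nn.ConstantPad2d(1, 0.0)
out = torch.cat([pad(a), b], dim=0)  # shape: 2x4x4
```

This reproduces exactly the 2x4x4 result shown in the question, and gradients flow through the padding.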
thnkim
(Pete Tae-hoon Kim)
March 11, 2017, 3:31pm
4
Thank you!
What I was looking for was constant padding. I used torch.nn.functional.pad(), which is not in the API documentation.