I’m trying to apply a softmax to certain slices of a 2D tensor of shape (batch_size, 30).
The code is like this:
# x has shape [batch_size, 30]
# area type
a = F.sigmoid(x[:, 0])
# curvature
b = F.softmax(x[:, 1:4], dim=1)
# facilities for bicycles
c = F.softmax(x[:, 4:9], dim=1)
# lane width
d = F.softmax(x[:, 9:12], dim=1)
# median type
e = F.softmax(x[:, 12:22], dim=1)
# number of lanes
f = F.softmax(x[:, 22:25], dim=1)
# rest
g = F.sigmoid(x[:, 25:])
print(torch.cat([a, b, c, d, e, f, g], dim=1))
Afterwards I try to concatenate or stack the tensors a to g back into one tensor, but I get this error:
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-56-956fcd57bc35> in <module>()
1 model.cpu()
----> 2 model(x[:5].cpu())
~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
323 for hook in self._forward_pre_hooks.values():
324 hook(self, input)
--> 325 result = self.forward(*input, **kwargs)
326 for hook in self._forward_hooks.values():
327 hook_result = hook(self, input, result)
<ipython-input-55-3e40b2ee85c5> in forward(self, x)
36 g = F.sigmoid(x[: , 25:])
37
---> 38 print(torch.cat([a, b, c, d, e, f, g], dim=1))
39 #return torch.cat((a, b, c, d, e, f, g), dim=1)
40
RuntimeError: dimension out of range (expected to be in range of [-1, 0], but got 1)
Changing the dim to 0 or -1 doesn’t work either. It does work if I try the concatenation on a batch of 1.
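For reference, here is a minimal standalone reproduction (the batch size of 5 is just an example I picked). It suggests the mismatch comes from the integer index `x[:, 0]`, which drops the second dimension and returns a 1D tensor, while the sliced pieces stay 2D; slicing with `0:1` instead seems to keep the dimension and avoid the error:

```python
import torch
import torch.nn.functional as F

x = torch.randn(5, 30)  # example batch of 5

a = torch.sigmoid(x[:, 0])       # integer index drops dim 1 -> shape (5,)
b = F.softmax(x[:, 1:4], dim=1)  # slice keeps dim 1 -> shape (5, 3)
print(a.shape, b.shape)          # torch.Size([5]) torch.Size([5, 3])

# torch.cat([a, b], dim=1) would raise the same "dimension out of range"
# error, because a has no dim 1 to concatenate along.

# Slicing 0:1 instead of indexing 0 keeps the second dimension:
a2 = torch.sigmoid(x[:, 0:1])    # shape (5, 1)
out = torch.cat([a2, b], dim=1)  # shape (5, 4)
print(out.shape)
```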