1D convolutions are NCL. To my knowledge there is no way around using permute or transpose operations, though if you only swap two dimensions, transpose is more readable. I also suggest writing one such operation per line, as in my code:
def forward(self, x):
    embedded = self.embed(x)             # (batch, seq_len) -> (batch, seq_len, embed_dim)
    embedded = embedded.transpose(0, 1)  # -> (seq_len, batch, embed_dim), the default LSTM layout
    outputs, (h, c) = self.lstm(embedded)
    h = h.transpose(0, 1).contiguous()   # (layers*dirs, batch, hidden) -> (batch, layers*dirs, hidden)
    h = h.view(h.size(0), -1)            # flatten to (batch, layers*dirs*hidden)
    res = self.fc(h)
    return res
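For the Conv1d case mentioned above, here is a minimal sketch of the same idea; the shapes and layer sizes are assumed purely for illustration:

import torch
import torch.nn as nn

# Assumed shapes for illustration: batch=8, seq_len=50, features=16.
x_nlc = torch.randn(8, 50, 16)  # NLC, e.g. the output of an embedding layer
conv = nn.Conv1d(in_channels=16, out_channels=32, kernel_size=3)
x_ncl = x_nlc.transpose(1, 2)   # NLC -> NCL, the layout Conv1d expects
y = conv(x_ncl)                 # y has shape (8, 32, 48)

Note that x_nlc.transpose(1, 2) and x_nlc.permute(0, 2, 1) are equivalent here; permute only becomes necessary when more than two dimensions move at once.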
See more: Inconsistent dimension ordering for 1D networks - NCL vs NLC vs LNC