In Keras I have an input of shape (None, 100, 192),
which is put through layers.LSTM(64),
which outputs a shape of (None, 64).
In PyTorch I have an input of shape [64, 192, 100], and putting it through nn.LSTM(100, 64) gives torch.Size([64, 192, 64]).
What do I need to do instead with the LSTM layer in PyTorch so that it outputs
(batch_size, 64), equivalent to Keras's (None, 64), rather than [64, 192, 64]?
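For reference, here is a minimal sketch reproducing the shapes I'm describing (assuming `batch_first=True`, so that [64, 192, 100] is read as (batch, seq_len, features); the batch size of 64 and sequence length of 192 are just the values from my tensors):

```python
import torch
import torch.nn as nn

# Input shaped (batch=64, seq_len=192, features=100), as in my setup.
x = torch.randn(64, 192, 100)

# batch_first=True is an assumption so the first dim is treated as the batch.
lstm = nn.LSTM(input_size=100, hidden_size=64, batch_first=True)

# output holds the hidden state for *every* timestep, hence the extra dim.
output, (h_n, c_n) = lstm(x)
print(output.shape)  # torch.Size([64, 192, 64]) -- I want (64, 64) instead
```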