I’m not deeply familiar with reading Keras model summaries, but are you using two Dense layers at the end?
The softmax (Dense) output shape is given as (None, 186, 12), which doesn’t match your nn.Linear output. If that’s the case, I’m unsure why a shape mismatch error isn’t raised when, e.g., calculating the loss.
Thanks for the reply. I changed my module to look like this. The reshape at the end is to make sure it predicts 12 classes. I think the issue might be in training. Any idea if I am using torch.cat correctly, and whether the reshape is right for nn.CrossEntropyLoss?
Also, do you happen to know if the PyTorch equivalent of Keras’ LSTM(memory_units, return_sequences=True, stateful=False, name='lstm')
would be nn.LSTM(16, 64, batch_first=True),
and then using the output?
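For what it’s worth, here is a minimal sketch of how that mapping could look, assuming 16 input features per timestep and 64 memory units (the concrete sizes are just placeholders). In nn.LSTM the first return value contains the hidden state for every timestep, which roughly corresponds to Keras’ return_sequences=True:

```python
import torch
import torch.nn as nn

# Assumed rough counterpart of Keras'
# LSTM(64, return_sequences=True, stateful=False)
# with 16 input features per timestep:
lstm = nn.LSTM(input_size=16, hidden_size=64, batch_first=True)

x = torch.randn(32, 186, 16)       # (batch, seq_len, features)
output, (h_n, c_n) = lstm(x)

# return_sequences=True corresponds to using `output`,
# which holds the hidden state at every timestep:
print(output.shape)                # torch.Size([32, 186, 64])
```

stateful=False matches the default behavior here, since no hidden state is carried over between forward calls.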
I would recommend checking the shape of X before the reshape operation, since flattening data into the batch dimension, as done via X.reshape(-1, 12, 186), is usually wrong.
If you are not careful, you could change the batch size of X, which would then create shape mismatches, e.g. when trying to calculate the loss.
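A small illustration of this failure mode, with made-up shapes: if the sequence length doesn’t match the target shape, reshape with -1 will silently absorb the difference into the batch dimension:

```python
import torch

x = torch.randn(32, 372, 12)       # batch size 32
y = x.reshape(-1, 12, 186)         # -1 soaks up the extra elements

# The batch size has silently doubled, so the loss calculation
# would now see 64 "samples" instead of 32:
print(y.shape)                     # torch.Size([64, 12, 186])
```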
Thanks! Without the reshape, my output size is torch.Size([32, 186, 12]). If I pass this to nn.CrossEntropyLoss, it raises an error. Basically, the number of classes is 12, right? If I follow the CrossEntropyLoss docs, I need to change the shape to (N, C, d1).
Yes, your interpretation of the expected input shape is correct.
However, the reshape operation is wrong, as it would interleave the values.
Since you want to swap two dimensions, use X = X.permute(0, 2, 1).contiguous() instead.
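A quick sketch of the difference, using the shapes from this thread: both operations produce a (32, 12, 186) tensor, but reshape reorders the flattened values while permute actually moves the class dimension into position, so only the permuted tensor is valid input for the loss:

```python
import torch
import torch.nn.functional as F

x = torch.randn(32, 186, 12)          # (batch, seq_len, classes) from the model
target = torch.randint(0, 12, (32, 186))

# Wrong: same shape, but the values are interleaved
wrong = x.reshape(32, 12, 186)

# Correct: swap the sequence and class dimensions -> (N, C, d1)
logits = x.permute(0, 2, 1).contiguous()

# Same shape, different contents:
print(torch.equal(wrong, logits))     # False (almost surely, for random data)

loss = F.cross_entropy(logits, target)
print(logits.shape, loss.item())
```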