Sure!
The first thing to do when running into a size mismatch error is to check your tensor shapes.
In your original code you are using `x = x.view(-1, 64)`. Since the shape of `x` is `[4, 64, 9, 9]` and you forced `x` into `[-1, 64]`, the `-1` is resolved to `4*9*9 = 324`, so your batch dimension is now larger than it should be. This yields exactly the error message for a size mismatch in the batch dimension (324 vs. 4).
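To illustrate, here is a minimal sketch with dummy data (the fc layer with 10 outputs is an assumption for demonstration, not your exact model):

```python
import torch
import torch.nn as nn

x = torch.randn(4, 64, 9, 9)          # activation coming out of the conv layers
wrong = x.view(-1, 64)                # -1 resolves to 4*9*9 = 324
print(wrong.shape)                    # torch.Size([324, 64])

fc = nn.Linear(64, 10)                # hypothetical fc layer with 10 output classes
out = fc(wrong)
print(out.shape)                      # torch.Size([324, 10]) -> 324 "samples" instead of 4

target = torch.randint(0, 10, (4,))   # targets for the real batch of 4
# loss = nn.CrossEntropyLoss()(out, target)
# -> raises the batch size mismatch error (324 vs. 4)
```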
The right approach is to keep the `batch_size` and flatten the feature map into `dim1`, e.g. via `x = x.view(x.size(0), -1)`.
Since `self.fc1` was too small for the `64*9*9` input features, its `in_features` had to be expanded accordingly.
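Putting it together, a minimal sketch of the fix (the conv stack, input size, and 10 output classes are assumptions; only the reshape and the `fc1` size are the actual point):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # assumed conv stack producing a [batch_size, 64, 9, 9] activation
        self.conv = nn.Conv2d(3, 64, kernel_size=3)
        self.pool = nn.AdaptiveAvgPool2d((9, 9))
        self.fc1 = nn.Linear(64 * 9 * 9, 10)   # in_features expanded to 64*9*9

    def forward(self, x):
        x = self.pool(torch.relu(self.conv(x)))
        x = x.view(x.size(0), -1)   # keep the batch dim, flatten features into dim1
        return self.fc1(x)

model = Net()
out = model(torch.randn(4, 3, 32, 32))
print(out.shape)   # torch.Size([4, 10]) -> batch dimension preserved
```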