RuntimeError: mat1 and mat2 shapes cannot be multiplied (64x48 and 3072x100)

I’m trying to input a 5D tensor with shape ( 1, 8, 32, 32, 32 ) to a VAE I wrote:

    self.encoder = nn.Sequential(
        nn.Conv3d( 8, 16, 4, 2, 1 ), # 32 -> 16
        nn.BatchNorm3d( 16 ), 
        nn.LeakyReLU( 0.2 ),
        
        nn.Conv3d( 16, 32, 4, 2, 1 ), # 16 -> 8
        nn.BatchNorm3d( 32 ),
        nn.LeakyReLU( 0.2 ),
        
        nn.Conv3d( 32, 48, 4, 2, 1 ), # 8 -> 4
        nn.BatchNorm3d( 48 ),
        nn.LeakyReLU( 0.2 ), 
    )
    
    self.fc_mu = nn.Linear( 3072, 100 ) # 48*4*4*4 = 3072
    self.fc_logvar = nn.Linear( 3072, 100 )
    
    self.decoder = nn.Sequential(
        nn.Linear( 100, 3072 ),
        nn.Unflatten( 1, ( 48, 4, 4, 4 )), # reshape (N, 3072) back to (N, 48, 4, 4, 4) for ConvTranspose3d
        nn.ConvTranspose3d( 48, 32, 4, 2, 1 ), # 4 -> 8
        nn.BatchNorm3d( 32 ),
        nn.Tanh(),
        
        nn.ConvTranspose3d( 32, 16, 4, 2, 1 ), # 8 -> 16
        nn.BatchNorm3d( 16 ),
        nn.Tanh(),
        
        nn.ConvTranspose3d( 16, 8, 4, 2, 1 ), # 16 -> 32
        nn.BatchNorm3d( 8 ),
        nn.Tanh(), 
    )

def encode( self, x ):
    x = self.encoder( x )
    x = x.view( -1, x.size( 1 ))
    
    mu = self.fc_mu( x )
    logvar = self.fc_logvar( x )
    
    return self.reparametrize( mu, logvar ), mu, logvar 
    
def decode( self, x ):
    return self.decoder( x )
    
def forward( self, data ):
    z, mu, logvar = self.encode( data )
    return self.decode( z ), mu, logvar 

The error I’m getting is: RuntimeError: mat1 and mat2 shapes cannot be multiplied (64x48 and 3072x100). I thought I had calculated the output dimensions of each layer correctly, but I must have made a mistake somewhere, and I’m not sure where. Thank you 🙂

Based on the error message, I would assume the error is raised in either self.fc_mu or self.fc_logvar, since both use the mentioned 3072 in_features.
You could add debug print statements to the forward (or encode) method of your model and check the shape of the activation that is passed to these layers, which should show the other shape from the error message (64x48).
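
For example, a minimal sketch of what that could look like in your encode method (the shape in the first comment is what I would expect for the (1, 8, 32, 32, 32) input you described):

def encode( self, x ):
    x = self.encoder( x )
    print( x.shape ) # should print torch.Size([1, 48, 4, 4, 4]) for a (1, 8, 32, 32, 32) input
    x = x.view( -1, x.size( 1 ))
    print( x.shape ) # shows what is actually passed to fc_mu / fc_logvar
    mu = self.fc_mu( x )
    logvar = self.fc_logvar( x )

    return self.reparametrize( mu, logvar ), mu, logvar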

I’ve edited the post to include the forward() code. I ended up fixing it (I think) by changing x = x.view( -1, x.size( 1 )) to x = x.view( x.size( 0 ), -1 ). Thank you 🙂
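
For completeness, a sketch of the encode method with that change (torch.flatten( x, 1 ) would be an equivalent alternative to the view call):

def encode( self, x ):
    x = self.encoder( x )         # (1, 48, 4, 4, 4)
    x = x.view( x.size( 0 ), -1 ) # (1, 3072): keep the batch dimension, flatten the rest

    mu = self.fc_mu( x )          # (1, 100)
    logvar = self.fc_logvar( x )  # (1, 100)

    return self.reparametrize( mu, logvar ), mu, logvar

The original x.view( -1, x.size( 1 )) kept 48 as the second dimension and spread the remaining 4*4*4 = 64 elements over the first dimension, which is exactly the (64x48) input the error message complained about.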