Tensor turns to None type in forward of VGG convnet

Hello, I am running into the following error in my ConvNet implementation.
The error comes up when executing the self.blocks line in the forward method of ConvNet.
After some investigation I realized my tensor actually turns into None after going through self.input_net in ConvNet's forward.
In case it is helpful, I have also added the architecture of the model I am trying to implement.
VGG Architecture Implemented

TypeError: conv2d() received an invalid combination of arguments - got (NoneType, Parameter, Parameter, tuple, tuple, tuple, int), but expected one of:
 * (Tensor input, Tensor weight, Tensor bias, tuple of ints stride, tuple of ints padding, tuple of ints dilation, int groups)
      didn't match because some of the arguments have invalid types: (!NoneType!, !Parameter!, !Parameter!, !tuple!, !tuple!, !tuple!, int)
 * (Tensor input, Tensor weight, Tensor bias, tuple of ints stride, str padding, tuple of ints dilation, int groups)
      didn't match because some of the arguments have invalid types: (!NoneType!, !Parameter!, !Parameter!, !tuple!, !tuple!, !tuple!, int)
import torch.nn as nn
from types import SimpleNamespace


class PreActResnetBlock(nn.Module):
  def __init__(self, c_in, c_out, kernel=3, stride=1, padding=1):
    """
    Inputs:
          c_in: number of input feeatures
          c_out: numberof output features
          kernel: convolution kernel size
          stride: convolution stride
          padding: convolution padding
    """  
    super().__init__()
    self.net = nn.Sequential(
                  nn.BatchNorm2d(c_in),
                  nn.ReLU(),
                  nn.Conv2d(c_in, c_out, kernel_size=kernel, padding=padding, stride=stride, bias=False)            
    )

  def forward(self, x):
    out = self.net(x)
    out += x


class VGGBlock(nn.Module):
  
  def __init__(self, c_in, c_out, last_block=False):
    """
    Inputs:
        last_block: if True, skip the 1x1 convolution at the beginning of the block
        c_in: number of input features to the convolution
        c_out: number of output features to the convolution
    """

    super().__init__()
    layers = []
    if not last_block:
      layers.append(nn.Conv2d(c_in, c_out, kernel_size=1, padding=0, stride=1))
    
    layers.extend([nn.MaxPool2d(kernel_size=3, stride=2, padding=1),
                    PreActResnetBlock(c_out, c_out),
                    PreActResnetBlock(c_out, c_out)])

    self.net = nn.Sequential(*layers)

  def forward(self, x):
    x = self.net(x)
    return x


class ConvNet(nn.Module):
    """
    This class implements a Convolutional Neural Network in PyTorch.
    It handles the different layers and parameters of the model.
    Once initialized, a ConvNet object can perform a forward pass.
    """
    
    def __init__(self, n_channels, n_classes):
        """
        Initializes ConvNet object.
        
        Args:
          n_channels: number of input channels
          n_classes: number of classes of the classification problem
        """
        super().__init__()

        self.hparams = SimpleNamespace(n_channels = n_channels,
                                        n_classes = n_classes)
        
        self._create_network()
        #self._init_params()

    def _create_network(self):
      hidden_dims = [64, 128, 256, 512]

      # Stem to scale up the channel size
      c_out = hidden_dims[0]
      self.input_net = nn.Sequential(
                          nn.Conv2d(self.hparams.n_channels, c_out, kernel_size=3, padding=1, bias=False),
                          PreActResnetBlock(c_out, c_out)
      )


      # VGG blocks
      blocks = []      
      for i in range(4):
        if i == 3:
          c_in = c_out = hidden_dims[-1]
          blocks.append(VGGBlock(c_in, c_out, last_block=True)) 
          break
        
        c_in = hidden_dims[i]
        c_out = hidden_dims[i+1]
        blocks.append(VGGBlock(c_in, c_out))
      self.blocks = nn.Sequential(*blocks)
      

      # Classification head mapping the features to the target classes
      self.output_net = nn.Sequential(
                            nn.MaxPool2d(kernel_size=3, stride=2, padding=1),
                            nn.Linear(c_out, self.hparams.n_classes)
      )
    
    def forward(self, x):
        """
        Performs the forward pass of the input. Here, an input tensor x is transformed
        through several layer transformations.
        
        Args:
          x: input to the network
        Returns:
          out: outputs of the network

        """ 
        ############
        # My input x turns to None type after going through self.input_net for some reason
        # and I can't figure out why.
        x = self.input_net(x)

        x = self.blocks(x)
        out = self.output_net(x)

        return out

That’s great debugging!
The reason for this is that you’ve forgotten to return out in the forward method of PreActResnetBlock. Adding it should solve the issue.
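For anyone hitting the same error later: a Python method without an explicit return statement returns None, so self.input_net(x) (whose last module is a PreActResnetBlock) hands None to the first conv2d inside self.blocks, which is exactly what the TypeError complains about. A minimal sketch of the corrected method, assuming nothing else in the block changes:

  def forward(self, x):
    out = self.net(x)   # BatchNorm -> ReLU -> Conv2d
    out += x            # residual connection
    return out          # without this line the method implicitly returns None

A quick sanity check like print(type(PreActResnetBlock(64, 64)(torch.randn(1, 64, 8, 8)))) also makes this kind of bug easy to spot, since it prints NoneType instead of torch.Tensor.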


Thank you very much! It always boggles me when I miss small bugs like this.