I cannot get my model's weights to match the input type

Hi everyone,

I am working on a neuroevolution project, building a model composed of several dense blocks whose skip connections may vary from block to block. I mention this to give context: the forward pass is built dynamically.

My code is the following:

'''Network class'''
class CNN(nn.Module):
  def __init__(self, e, denseBlocks, links, classifier, init_weights=True):
    super(CNN, self).__init__()
    # Flatten every layer of every dense block into one sequential
    # for the feature-extraction part.
    extraction = []
    for block in denseBlocks:
      for layer in block:
        extraction += layer
    self.extraction = nn.Sequential(*extraction)
    self.classifier = nn.Sequential(*classifier)
    # Keep the structured views around for the dynamic forward pass.
    self.denseBlocks = denseBlocks
    self.links = links
    self.connections = e.second_level
    self.first_level = e.first_level
    self.nblocks = e.n_block

  def forward(self, x):
    '''Feature extraction'''
    for i in range(self.nblocks):
        block = self.denseBlocks[i]
        connections = self.connections[i]
        link = self.links[i]
        prev = -1
        pos = 0
        outputs = []
        for j in range(self.first_level[i]['nconv']):
          if j == 0 or j == 1:
            x = nn.Sequential(*block[j])(x)
            outputs.append(x)
          else:
            # Concatenate the outputs of the active skip connections
            # before applying the next layer.
            conn = connections[pos:pos+prev]
            for c in range(len(conn)):
              if conn[c] == 1:
                x2 = outputs[c]
                x = torch.cat((x, x2), dim=1)
            x = nn.Sequential(*block[j])(x)
            outputs.append(x)
            pos += prev
          prev += 1
        x = nn.Sequential(*link)(x)

    x = torch.flatten(x, 1)
    '''Classification'''
    x = self.classifier(x)
    return nn.functional.log_softmax(x, dim=1)

Inside __init__, I add the convolutional part (i.e. all the dense blocks and transition operations) as self.extraction and the classification part as self.classifier. In the forward pass, I use the information inside self.denseBlocks to dynamically pass the input x through the convolutional layers of each block, until the end of the CNN.

The problem appears when I pass a random tensor through the network, just to check that it computes correctly. I have:

#Create network
net = CNN(e, network[0], network[1], network[2])
net.to(device, dtype=torch.float32)

#Create random tensor in cuda
image = torch.rand([1, 1, 256, 256], device=device)
net(image)

What I get is:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-83-0987cf7a2c44> in <module>()
----> 1 net(image)

6 frames
/usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py in _conv_forward(self, input, weight)
    418                             _pair(0), self.dilation, self.groups)
    419         return F.conv2d(input, weight, self.bias, self.stride,
--> 420                         self.padding, self.dilation, self.groups)
    421 
    422     def forward(self, input: Tensor) -> Tensor:

RuntimeError: Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same

I have tried many combinations, like calling .cuda() or not using dtype=torch.float32, but I have no idea how to solve it. I have read that possibly not all parts of the network are on the GPU, but I have already defined the two sequentials, and the modules I use in the forward pass are the same ones that are inside them.

Thanks in advance for your help!

Could you check if all passed arguments are nn.Module objects, i.e. denseBlocks, links, and classifier?
It’s unclear from the error message which module is raising this issue, and the code looks generally alright.
Wrapping the modules in nn.Sequential inside the forward is unusual, but it shouldn’t change the device.
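
One quick check (a minimal sketch, assuming net is the model instance from your post): print every registered parameter and its device. Submodules stored in plain Python lists are not registered, so their parameters won't show up here and won't be moved by net.to(device).

# Print each parameter PyTorch has actually registered and where it lives.
# Anything held only in a plain Python list is invisible to nn.Module and
# will be missing from this output entirely.
for name, param in net.named_parameters():
    print(name, param.device)

If some of the conv layers used in your forward are missing from that list, that would explain the device mismatch.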

Feel free to add the missing module definitions in case you get stuck, so that we can help debugging. :wink:


Thank you very much! You were right! The elements inside the links list were not added correctly: I used an nn.ModuleList instead of a normal list and the problem was finally solved! :raised_hands:
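
For anyone hitting the same error later, the fix looked roughly like this inside __init__ (a sketch, assuming each element of links is itself a list of modules, as in the code above):

# Register the link modules as proper submodules so that net.to(device)
# moves their weights as well; a plain Python list is invisible to
# nn.Module, so these parameters were staying on the CPU.
self.links = nn.ModuleList(nn.ModuleList(link) for link in links)

(The modules inside denseBlocks were already registered through self.extraction, which is why only the link layers stayed on the CPU.)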
