I defined a standard ResNet50 and built a custom module (inheriting from nn.Module) that modifies the weights of the convolutional layers. The following code successfully swaps the old modules for the new ones:
def modify_model(model, args):
    for m in model._modules:
        child = model._modules[m]
        if is_leaf(child):
            if isinstance(child, nn.Linear):
                model._modules[m] = modifyLinear(child, args)
                del child
            elif isinstance(child, nn.Conv1d):
                model._modules[m] = modifyConv1d(child, args)
                del child
        else:
            modify_model(child, args)  # recurse into submodules
    return model
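For context, here is roughly how I invoke the conversion end-to-end. The is_leaf and modify* helpers below are stand-in stubs I wrote just for this post (my real ones do the actual wrapping), and the tiny Sequential is a made-up example model:

```python
import torch.nn as nn

# Stub: a module with no children is a leaf.
def is_leaf(module):
    return len(module._modules) == 0

# Stubs standing in for my real replacement helpers.
def modifyLinear(child, args):
    return child

def modifyConv1d(child, args):
    return child

def modify_model(model, args):
    for m in model._modules:
        child = model._modules[m]
        if is_leaf(child):
            if isinstance(child, nn.Linear):
                model._modules[m] = modifyLinear(child, args)
            elif isinstance(child, nn.Conv1d):
                model._modules[m] = modifyConv1d(child, args)
        else:
            modify_model(child, args)  # recurse into submodules
    return model

# Toy model just to show the call pattern.
net = nn.Sequential(nn.Conv1d(3, 8, 3), nn.ReLU(), nn.Linear(8, 2))
net = modify_model(net, args=None)
```

With the real helpers, the Conv1d and Linear entries come back as the wrapped versions.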
And here is a stripped down version of the new module:
class modifying_module(nn.Module):
    def __init__(self, module):
        super(modifying_module, self).__init__()
        bias = module.bias is not None
        self.conv = nn.Conv1d(module.in_channels, module.out_channels,
                              kernel_size=module.kernel_size, stride=module.stride,
                              padding=module.padding, dilation=module.dilation,
                              bias=bias, groups=1)

    def forward(self, x):
        masked_weight = self.conv.weight * self.mask
        x = F.conv1d(x, masked_weight, self.conv.bias, self.conv.stride,
                     self.conv.padding, self.conv.dilation, 1)
        return x

    @property
    def mask(self):
        return torch.ones(self.conv.weight.size())
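Part of what confuses me is that, in isolation, the class runs fine on a dummy tensor. Here is a minimal self-contained repro (the sizes are made up for the example):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class modifying_module(nn.Module):
    def __init__(self, module):
        super(modifying_module, self).__init__()
        bias = module.bias is not None
        # Rebuild a Conv1d with the same hyperparameters as the base module.
        self.conv = nn.Conv1d(module.in_channels, module.out_channels,
                              kernel_size=module.kernel_size, stride=module.stride,
                              padding=module.padding, dilation=module.dilation,
                              bias=bias, groups=1)

    def forward(self, x):
        # Apply the mask to the weights before convolving.
        masked_weight = self.conv.weight * self.mask
        return F.conv1d(x, masked_weight, self.conv.bias, self.conv.stride,
                        self.conv.padding, self.conv.dilation, 1)

    @property
    def mask(self):
        # Trivial all-ones mask for this stripped-down version.
        return torch.ones(self.conv.weight.size())

base = nn.Conv1d(4, 6, kernel_size=3, padding=1)
wrapped = modifying_module(base)
out = wrapped(torch.randn(2, 4, 10))  # forward pass works on its own
```

So the forward pass itself is not the problem when the wrapper is built directly from a fresh Conv1d.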
As you can see, the modifying_module for a convolutional layer defines a new nn.Conv1d from the information in the base module. Alternatively, I have also tried the following, with no improvement:
self.conv = copy.deepcopy(module)  # with "import copy" at the top
The problem is that the error keeps saying "'modifying_module' has no attribute 'weight'" and points at the masked_weight line. I have even tried copying the weight and registering it as a parameter explicitly; then it complains that there is no 'bias'. So I tried manually registering every parameter and buffer of the convolution, and it still gives me errors.
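For concreteness, this is roughly the kind of manual registration I attempted (Wrapper is a stand-in name for this post, not my real class, and the sizes are invented):

```python
import torch.nn as nn

base = nn.Conv1d(4, 6, kernel_size=3)

class Wrapper(nn.Module):
    def __init__(self, module):
        super(Wrapper, self).__init__()
        self.conv = module
        # Re-expose the inner conv's parameters on the wrapper itself,
        # under the attribute names the caller seems to expect.
        self.weight = module.weight
        self.bias = module.bias

w = Wrapper(base)
params = dict(w.named_parameters())
```

Assigning an existing nn.Parameter to an attribute registers it on the wrapper (it shares storage with the conv's weight), yet the errors persist in my full setup.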
I don't understand why it expects modifying_module to have a 'weight' attribute separate from the convolution's weight when I am explicitly accessing the weight through self.conv.
Any and all feedback would be greatly appreciated.