Inplace error when gradient is required

Following the official documentation for residual networks, I wrote the following network class. However, when I run the code, I get the error

"one of the variables needed for gradient computation has been modified by an inplace operation "

```python
import torch.nn as nn


class ResBlock(nn.Module):
    def __init__(self, num_node, num_fc, activate=nn.ReLU(inplace=True)):
        super(ResBlock, self).__init__()
        self.act = activate
        self.linears_list = [nn.Linear(num_node, num_node) for i in range(num_fc)]
        self.acti_list = [self.act for i in range(num_fc)]
        # interleave linear layers and activations: [linear, act, linear, act, ...]
        self.block = nn.Sequential(*[item for pair in zip(self.linears_list, self.acti_list) for item in pair])

        # Xavier normal initialization
        for m in self.block:
            if isinstance(m, nn.Linear):
                nn.init.xavier_normal_(m.weight.data, gain=1.0)

    def forward(self, x):
        residual = x
        out = self.block(x)
        out += residual  # this line gives the error
        out = self.act(out)
        return out
```

However, if I change `out += residual` to `out = out + residual`, then everything works perfectly.
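To see the difference in isolation, here is a minimal sketch (independent of the class above, with arbitrary shapes):

```python
import torch
import torch.nn as nn

x = torch.randn(4, 3, requires_grad=True)
block = nn.Sequential(nn.Linear(3, 3), nn.ReLU())

# Out-of-place addition allocates a new tensor; the ReLU output that
# autograd saved for backward is left untouched.
out = block(x)
out = out + x
out.sum().backward()  # fine

# In-place addition overwrites the ReLU output that autograd saved,
# so the error surfaces when backward() checks its saved tensors.
out = block(x)
out += x
try:
    out.sum().backward()
except RuntimeError as e:
    print(e)  # one of the variables needed for gradient computation ...
```

Note that the in-place version only fails at `backward()` time, because that is when autograd checks the version counters of the tensors it saved.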

But torchvision's resnet (vision/resnet.py at master · pytorch/vision · GitHub) does the same thing.

I don't understand why it's not working in my case.

This depends on whether the tensor you modify in place is needed for the gradient computation, either by the function that returned it (as here) or by a function that later used it as input before you modified it. In your block the last layer is a ReLU, and ReLU saves its output to do backward, so the in-place `out += residual` overwrites a tensor autograd still needs. In torchvision's resnet, the addition comes right after a batch norm, whose backward does not need its output.
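A minimal sketch of that difference (not the torchvision code, just the two layer types in isolation):

```python
import torch
import torch.nn as nn

x = torch.randn(4, 3, requires_grad=True)

# ReLU's backward uses its saved *output*, so writing to it in place
# invalidates a tensor the graph still needs.
out = nn.ReLU()(x)
out += 1
try:
    out.sum().backward()
except RuntimeError as e:
    print("relu:", e)

# BatchNorm's backward uses its saved *input* and batch statistics,
# not its output, so the same in-place write is harmless here.
out = nn.BatchNorm1d(3)(x)
out += 1
out.sum().backward()  # works
```

This is why the in-place addition in torchvision's resnet is safe: it happens right after a batch norm rather than after an activation that saves its output.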

Best regards

Thomas