A small question about an in-place error

import torch
import torch.nn as nn
import random

class Drop_attr3(nn.Module):
    def __init__(self):
        super(Drop_attr3, self).__init__()

    def forward(self, x, adj):
        row = x.size()[0]
        i = random.randint(0, row - 1)   # pick a random row index
        x = make_zero_row(x, i)
        return x, adj

def make_zero_row(x, row):
    torch.zero_(x[row])   # in-place: zeroes the selected row of x
    return x

Here is my model. When I try to train it, the following error is reported: "one of the variables needed for gradient computation has been modified by an inplace operation". I want to know how to set one row of the tensor to 0 without running into in-place problems.
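For reference, a minimal snippet that triggers the same error; here exp() just stands in for an earlier layer whose backward pass needs its output (in the real model x comes from previous layers of the network):

import torch

x = torch.randn(3, 4, requires_grad=True)
y = x.exp()          # exp saves its output for the backward pass
y[0].zero_()         # in-place write bumps y's version counter
y.sum().backward()   # RuntimeError: one of the variables needed for gradient
                     # computation has been modified by an inplace operation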
I’d appreciate some help!

I think it's working now. You only need to add x = x + 0, which creates a new tensor, so the in-place fill afterwards no longer touches a tensor that the backward pass needs. I've also changed some lines to add support for batched data.

import torch
import torch.nn as nn
import random

class Drop_attr3(nn.Module):
    def __init__(self):
        super(Drop_attr3, self).__init__()

    def forward(self, x, adj):
        x = x + 0                        # copy x so the original tensor is not modified in place
        rows = x.size()[1]               # number of rows per sample (dim 0 is the batch)
        i = random.randint(0, rows - 1)  # pick a random row index
        x[:, i].fill_(0)                 # zero that row in every sample of the batch
        return x, adj

# test the code
total = 256 * 56 * 4
x = torch.arange(0, total).view(256, 56, 4).float()
x.requires_grad = True
m = Drop_attr3()
o = m(x, 1)                 # the second argument stands in for adj
o[0].mean().backward()      # backward now runs without the in-place error
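If you want to avoid the in-place fill_ altogether, here is a minimal alternative sketch (the class name DropAttrMasked and the mask trick are just a suggestion, not part of the code above; it reuses the imports from the snippet above): zero the chosen row by multiplying with a mask, so nothing the backward pass needs is modified.

class DropAttrMasked(nn.Module):
    def forward(self, x, adj):
        i = random.randint(0, x.size(1) - 1)
        mask = torch.ones_like(x)   # fresh tensor with no autograd history
        mask[:, i] = 0              # safe in-place write: mask does not require grad
        return x * mask, adj        # out-of-place multiply keeps autograd happy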

Thanks a lot! It does work!