nn.ReLU inplace operation error

I want to pass some parameters through nn.ReLU without setting inplace=True, but I still get the error: RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.
Here is my code. I just want to figure out whether PyTorch sets inplace=True automatically in train mode:

self.p6_w1 = nn.Parameter(torch.ones(2, dtype=torch.float32), requires_grad=True)
self.p6_w1_relu = nn.ReLU()

p6_w1 = self.p6_w1_relu(self.p6_w1)

No, the inplace argument won't be changed automatically; it keeps whatever value you set during instantiation (the default is inplace=False).
If you are not using inplace ReLUs, some other operation might be manipulating the data in place, such as an assignment via indexing (e.g. x[0] = 1.) on the parameter.
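As a minimal sketch of the difference (assuming a standalone parameter similar to self.p6_w1): the out-of-place ReLU returns a new tensor and works fine on a leaf parameter, while the in-place version raises the RuntimeError from the question.

```python
import torch
import torch.nn as nn

# A leaf parameter that requires grad, as in the question.
p = nn.Parameter(torch.ones(2, dtype=torch.float32), requires_grad=True)

# Out-of-place ReLU (the default, inplace=False): returns a NEW tensor,
# so the leaf parameter itself is never modified.
out = nn.ReLU()(p)

# In-place ReLU: tries to overwrite the leaf parameter's data directly,
# which autograd forbids for leaves that require grad.
try:
    nn.ReLU(inplace=True)(p)
    raised = False
except RuntimeError:
    raised = True

print(raised)           # the in-place call raises RuntimeError
print(out.is_leaf)      # the out-of-place result is not a leaf
```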