I have the following in the forward method:

```
def forward(self, x, member_id=None):
    model_output = F.softmax(self.models(x), dim=1)
    model_output[:, 2:4] = torch.min(torch.cat((model_output[:, :2], model_output[:, 5:]), dim=1), dim=1)[0].repeat(2, 1).permute(1, 0)
    return model_output
```

When I call `loss.backward()`, the following error pops up:

`RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [1, 10]], which is output 0 of SoftmaxBackward, is at version 1; expected version 0 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!`

**There is no error without F.softmax.**
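For reference, the slice assignment writes into the softmax output in place, which invalidates the saved tensor that `SoftmaxBackward` needs. A minimal sketch of an out-of-place alternative (assuming 10 output classes, and using plain logits in place of `self.models(x)` so it runs standalone) would rebuild the tensor with `torch.cat` instead of assigning into it:

```python
import torch
import torch.nn.functional as F

def forward(x):
    # stand-in for self.models(x): raw logits of shape (batch, 10)
    probs = F.softmax(x, dim=1)
    # minimum over the columns outside 2:5, same columns as the original cat
    m = torch.min(torch.cat((probs[:, :2], probs[:, 5:]), dim=1), dim=1)[0]
    fill = m.unsqueeze(1).expand(-1, 2)  # shape (batch, 2), replaces columns 2 and 3
    # build a new tensor instead of writing into probs in place
    return torch.cat((probs[:, :2], fill, probs[:, 4:]), dim=1)
```

With this version `loss.backward()` has nothing modified in place, so autograd can compute the softmax gradient normally.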