Change Tensor in forward pass

I’m trying to modify a tensor inside the forward pass of a neural network, copying one value to another.
Something like

def forward(self, X):
    x = self.fc(X)
    x = self.modify_tensor(x)
    return self.fc2(x)

def modify_tensor(self, X):
    # pick a random row index, keeping it below the last row so idx + 1 is valid
    idx = random.randrange(X.shape[0] - 1)
    X[idx] = X[idx + 1]
    return X

First, I’m not sure it’s correct, since I want the gradients to be computed according to the new value. Secondly, this does run (not sure if correctly), but the backward pass for such an operation takes forever, and I don’t understand why. Is there a better way?

For correctness: yes, it will work. Keep in mind that the values you overwrite will receive zero gradients, as they are no longer used.
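A minimal sketch illustrating this, with assumed shapes (the `clone` is only needed here because the example starts from a leaf tensor; in your forward pass the activation is already a non-leaf tensor):

```python
import torch

# Overwrite row 0 with row 1, then inspect the gradients.
x = torch.rand(4, 3, requires_grad=True)
y = x.clone()   # clone so the in-place write is not applied to a leaf tensor
y[0] = y[1]     # copy one row onto another, as in modify_tensor
y.sum().backward()

print(x.grad)
# row 0 of x.grad is all zeros (its value was discarded),
# row 1 is all twos (it contributes both as itself and in place of row 0)
```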

For speed, it should not change much.
Could you provide sample tensors (generated with torch.rand, for example) to build a runnable example that reproduces the slowdown?
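As a starting point, here is a self-contained reproduction sketch; the layer sizes and batch size are made up, so you would swap in the dimensions from your actual model:

```python
import random
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # hypothetical sizes, chosen only to make the example runnable
        self.fc = nn.Linear(8, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, X):
        x = self.fc(X)
        x = self.modify_tensor(x)
        return self.fc2(x)

    def modify_tensor(self, X):
        # copy a random row's successor onto it, as in the question
        idx = random.randrange(X.shape[0] - 1)
        X[idx] = X[idx + 1]
        return X

net = Net()
X = torch.rand(16, 8)  # sample input generated with torch.rand
out = net(X)
out.sum().backward()   # time this call to measure the reported slowdown
```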