I’m trying to modify a tensor in the forward pass of a neural network, copying one row's value into another row. Something like:
def forward(self, X):
    x = self.fc(X)
    x = self.modify_tensor(x)
    return self.fc2(x)

def modify_tensor(self, X):
    # needs `import random`; pick idx so that idx + 1 is still in range
    idx = random.randrange(X.shape[0] - 1)
    X[idx] = X[idx + 1]  # in-place write
    return X
First, I’m not sure this is correct, since I want the gradients to be computed according to the new (copied) value. Second, it does run (though I'm not sure it's correct), but the backward pass for this operation takes forever and I don't understand why. Is there a better way?
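One likely culprit is the in-place write `X[idx] = X[idx + 1]`, which forces autograd to version-check and route gradients through an index assignment. A common alternative is to build the modified tensor out-of-place, e.g. with `torch.where`, so the copied row simply becomes another differentiable path in the graph. A minimal sketch (the class, layer sizes, and the `copy_row` helper name are illustrative assumptions, not the original model):

```python
import random

import torch
import torch.nn as nn


class Net(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.fc = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)

    def forward(self, x):
        x = self.fc(x)
        x = self.copy_row(x)
        return self.fc2(x)

    @staticmethod
    def copy_row(x):
        # Pick a random row and replace it with the row after it,
        # without any in-place write: build the result out-of-place
        # so autograd sees a clean, differentiable op.
        idx = random.randrange(x.shape[0] - 1)
        mask = torch.zeros(x.shape[0], 1, dtype=torch.bool, device=x.device)
        mask[idx] = True
        # shifted[i] == x[i + 1]; torch.where selects it only at row idx.
        shifted = torch.roll(x, shifts=-1, dims=0)
        return torch.where(mask, shifted, x)
```

With this version, the overwritten row contributes no gradient of its own, while the source row receives gradient through both its original position and the copy, which matches "gradients according to the new value".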