The following code snippet produces the RuntimeError “one of the variables needed for gradient computation has been modified by an inplace operation”:
for i in range(n):
    arr[i] = torch.cross(vec[i], vec[i+1])
    arr[i] = arr[i] / torch.norm(arr[i])
Whereas if I refactor it to the following,
for i in range(n):
    tmp = torch.cross(vec[i], vec[i+1])
    arr[i] = tmp / torch.norm(tmp)
it runs without any issue. My question is: is the former example an in-place operation? Does reassigning an element of the array not create a new variable?
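For reference, here is a minimal self-contained comparison of the two variants. The setup is my assumption of what the question implies (a leaf tensor `vec` with `requires_grad=True` and a preallocated buffer `arr`); the shapes and `n` are arbitrary:

```python
import torch

def run(refactored):
    # Assumed setup: vec is a leaf tensor requiring grad,
    # arr is a preallocated buffer the loop writes into.
    n = 3
    torch.manual_seed(0)
    vec = torch.randn(n + 1, 3, requires_grad=True)
    arr = torch.zeros(n, 3)
    for i in range(n):
        if refactored:
            tmp = torch.cross(vec[i], vec[i + 1], dim=0)
            arr[i] = tmp / torch.norm(tmp)
        else:
            arr[i] = torch.cross(vec[i], vec[i + 1], dim=0)
            # This second write modifies arr in place after arr[i]
            # was already read (and saved for backward) by the division.
            arr[i] = arr[i] / torch.norm(arr[i])
    try:
        arr.sum().backward()
        return None
    except RuntimeError as e:
        return e

err_original = run(refactored=False)    # RuntimeError about an inplace op
err_refactored = run(refactored=True)   # None: backward succeeds
print(err_original)
print(err_refactored)
```

The difference is that in the original version, `arr[i] / torch.norm(arr[i])` saves the view `arr[i]` for its backward pass, and the subsequent assignment `arr[i] = ...` bumps `arr`'s version counter, invalidating that saved tensor; in the refactored version, the saved tensor is `tmp`, which is never overwritten.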