Modifying a Tensor with requires_grad=True in PyTorch - Maintaining Connection for Backpropagation

Hi everyone,

I’m encountering an issue while creating a tensor from an input matrix and performing further analysis on it in PyTorch. My code involves modifying a tensor with requires_grad=True, but I’m running into a RuntimeError about in-place operations.


Code Snippet:

import torch


def zero_dimension_Topology_computation(distance_matrix):
    size = distance_matrix.shape[0]

    distance = torch.zeros(int(size * (size - 1) * 1.5), requires_grad=True)

    counter = 0
    for i in range(size):
        for j in range(i + 1, size):
            distance[counter] = distance_matrix[i, j]
            counter += 1
            distance[counter] = i
            counter += 1
            distance[counter] = j
            counter += 1

    return distance

my_distance_matrix = torch.tensor(
    [[0.,         1.1220787,  2.56805496, 3.17087232, 4.19878769, 5.28180264, 6.00723115, 7.00725978, 8.10627762, 9.00086877],
     [1.1220787,  0.,         1.48800688, 2.52166396, 3.09666062, 4.57047671, 5.06417233, 6.00301049, 7.04549262, 8.02508567],
     [2.56805496, 1.48800688, 0.,         2.82092185, 2.02773478, 4.46949128, 4.43069515, 5.16422506, 6.00762258, 7.21203433],
     [3.17087232, 2.52166396, 2.82092185, 0.,         2.51119582, 2.11094118, 3.08805856, 4.22031097, 5.51843328, 6.06739184],
     [4.19878769, 3.09666062, 2.02773478, 2.51119582, 0.,         3.14220102, 2.54342334, 3.14917369, 4.00012553, 5.19276009],
     [5.28180264, 4.57047671, 4.46949128, 2.11094118, 3.14220102, 0.,         1.72659306, 2.84336824, 4.25008607, 4.29969124],
     [6.00723115, 5.06417233, 4.43069515, 3.08805856, 2.54342334, 1.72659306, 0.,         1.17321974, 2.56312249, 3.00479065],
     [7.00725978, 6.00301049, 5.16422506, 4.22031097, 3.14917369, 2.84336824, 1.17321974, 0.,         1.40677017, 2.04867891],
     [8.10627762, 7.04549262, 6.00762258, 5.51843328, 4.00012553, 4.25008607, 2.56312249, 1.40677017, 0.,         1.747742  ],
     [9.00086877, 8.02508567, 7.21203433, 6.06739184, 5.19276009, 4.29969124, 3.00479065, 2.04867891, 1.747742,   0.        ]],
    requires_grad=True)


zero_dimension_Topology_computation(my_distance_matrix)

Error:

RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation.

Problem:

I want to create a tensor distance from distance_matrix while keeping requires_grad=True for backpropagation. However, directly assigning values to distance using in-place operations throws an error.

Desired Outcome:

  1. Create a tensor distance from distance_matrix.
  2. Modify distance for further analysis.
  3. Maintain the connection to the computation graph for backpropagation.

Question:

How can I effectively modify a tensor with requires_grad=True while preserving its connection to the computation graph for backpropagation?

Hi Yasin!

When you create distance you are creating a new tensor that has
requires_grad set to True. This makes it a so-called “leaf Variable
that requires grad.” You are not allowed to modify such tensors in place
(as doing so messes up how autograd keeps track of what it is supposed
to be computing gradients with respect to), but assigning into distance
with, say, distance[counter] = distance_matrix[i, j] does count
as modifying it in place. Hence the error.
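The error is easy to reproduce in isolation (a minimal sketch):

```python
import torch

# A freshly created tensor with requires_grad=True is a "leaf Variable
# that requires grad", and autograd forbids in-place writes into it.
leaf = torch.zeros(3, requires_grad=True)
try:
    leaf[0] = 1.0  # indexed assignment is an in-place operation
except RuntimeError as e:
    print(e)  # "a view of a leaf Variable that requires grad ..."
```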

Because my_distance_matrix, the tensor you pass into your function
as its distance_matrix argument, carries requires_grad = True, the
results of subsequent computations will also carry requires_grad = True.

So do not create distance with requires_grad = True – it’s not needed.

When you first create distance (without requires_grad = True), it will
be a so-called leaf tensor, but will not have requires_grad = True. So
you are allowed to modify it in place.
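Concretely, the only change needed is dropping requires_grad = True when
allocating distance (a sketch of the corrected function):

```python
import torch

def zero_dimension_Topology_computation(distance_matrix):
    size = distance_matrix.shape[0]

    # Plain leaf tensor (requires_grad defaults to False), so in-place
    # writes into it are allowed.
    distance = torch.zeros(int(size * (size - 1) * 1.5))

    counter = 0
    for i in range(size):
        for j in range(i + 1, size):
            # Writing a grad-carrying value in connects distance to the
            # computation graph; distance then requires grad and is no
            # longer a leaf.
            distance[counter] = distance_matrix[i, j]
            counter += 1
            distance[counter] = i
            counter += 1
            distance[counter] = j
            counter += 1

    return distance
```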

With distance[counter] = distance_matrix[i, j], you do modify it in
place, and do so with the requires_grad = True tensor distance_matrix.
This operation does preserve the connection to the computation graph and
fully supports backpropagation.

After this operation, distance does carry requires_grad = True, but
because it is no longer a leaf tensor you are allowed to further modify it
in place, such as with distance[counter] = i.

If you pass into zero_dimension_Topology_computation() an argument
that has requires_grad = True, the tensor it outputs will as well, and you
will have a proper computation graph and will be able to backpropagate
through the call to zero_dimension_Topology_computation().
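As a side note, if the python loops ever become a bottleneck, the same
(distance, i, j) triples can be built without any loop while still
preserving gradients, since advanced indexing is itself differentiable.
A sketch (the function name here is made up, not from your code):

```python
import torch

def zero_dim_topology_vectorized(distance_matrix):
    size = distance_matrix.shape[0]
    # All (i, j) pairs of the strict upper triangle, in row-major order,
    # matching the nested-loop traversal.
    i, j = torch.triu_indices(size, size, offset=1)
    # Interleave (distance, i, j) triples; distance_matrix[i, j] keeps
    # the result connected to the computation graph.
    triples = torch.stack(
        [distance_matrix[i, j],
         i.to(distance_matrix.dtype),
         j.to(distance_matrix.dtype)],
        dim=1,
    )
    return triples.reshape(-1)
```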

Best.

K. Frank


Thank you so much.
Best regards
Yasin K