Not able to backpropagate because of tensor.data

Hi team,
Please find below the code snippet. I want to backpropagate all the way back to input_1, but I am facing an issue:
  list_1 = []
  for i in range(10):
      batch = [list of frames]
      input_1 = torch.cat(batch, 0)
      output = model(input_1)
      list_1.append(output.data)

The output of the model is a tensor with requires_grad = True. When I append output to a list (list_1), the code crashes after some iterations, while if I append output.data the code runs fine, but requires_grad becomes False and backpropagation won't work. What is the correct way to solve this problem?
Thanks in advance.

Hi,

When you use .data, you break the differentiable link between the element you add to the list and the net.
If you don't use it, I guess you keep adding more and more to the list and eventually run out of memory because of all the autograd state you keep around.

If you do want to be able to compute gradients here, you should not use .data (in general, you should never use .data :smiley: ). But you will have to be careful to make sure the list does not get too big, to limit the memory requirements.
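A minimal sketch of the two options, using a hypothetical small Linear model and random inputs in place of the frame batches: append the output itself when you need gradients (keeping the list short, since each entry holds its graph alive), or append output.detach() instead of output.data when you don't.

```python
import torch

model = torch.nn.Linear(4, 2)  # hypothetical stand-in for the real model

# Option 1: keep the graph so backpropagation works.
list_1 = []
for i in range(10):  # keep this loop short: every entry retains its graph
    input_1 = torch.randn(3, 4, requires_grad=True)  # stand-in for torch.cat(batch, 0)
    output = model(input_1)
    list_1.append(output)  # NOT output.data: the differentiable link is kept

loss = torch.cat(list_1, 0).sum()
loss.backward()  # gradients flow back through every iteration, down to the inputs
assert model.weight.grad is not None

# Option 2: if you only need the values, detach instead of using .data.
values = [o.detach() for o in list_1]  # requires_grad is False, graph is freed
assert not values[0].requires_grad
```

detach() is the supported replacement for .data: it returns a tensor that shares storage but is explicitly cut out of the graph, and unlike .data it is safe with respect to autograd's correctness checks.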
