Can we detach a tensor to manipulate it, then cast it back to a tensor with requires_grad=True and backpropagate through it?

I am working on code that does some averaging on the detached NumPy array of probabilities from a PyTorch tensor. Can I recreate a tensor from the averaged NumPy array with requires_grad=True and backpropagate?

I’m pretty sure that once the graph has been freed, it’s gone. See if you can find the equivalent operations in PyTorch, or define a custom torch.autograd.Function, which lets you supply the backward pass yourself and backpropagate through an external computation (see the sketch below).
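For example, here is a minimal sketch of such a Function that drops to NumPy for a mean but still supports backprop. NumpyMean is a hypothetical name, and the hand-written backward is specific to the mean:

import torch

class NumpyMean(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        # leave autograd here: any NumPy manipulation can happen in between
        result = x.detach().cpu().numpy().mean()
        return torch.tensor(result, dtype=x.dtype, device=x.device)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # gradient of a mean: each input element contributes 1/n
        return grad_output * torch.ones_like(x) / x.numel()

x = torch.randn(4, requires_grad=True)
y = NumpyMean.apply(x)
y.backward()
print(x.grad)  # tensor([0.2500, 0.2500, 0.2500, 0.2500])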

Could you share a minimal reproducible example?

Thanks for the reply.
Sure, here are the operations it performs:
compute_batch_predictions() detaches the tensor to a NumPy array, which is then used to create a DataFrame that does the averaging.

def train():
    ...
    # compute_batch_predictions() returns detached NumPy arrays,
    # so the computation graph is lost from this point on
    batch_predictions = compute_batch_predictions(output, mode=parameters["model_mode"])
    pred_df = pd.DataFrame({k: v[:, 1] for k, v in batch_predictions.items()})
    pred_df.columns.names = ["label", "view_angle"]
    # average the predictions over view angles for each label
    predictions = pred_df.T.reset_index().groupby("label").mean().T[LABELS.LIST].values
    l = loss(predictions, gt)  # predictions is a NumPy array here, not a tensor
    optimizer.zero_grad()
    l.backward()
    optimizer.step()
    ...

As @AlphaBetaGamma96 explained, once you’ve detached from the computation graph you won’t be able to backpropagate through it anymore. Based on your code it seems you are calculating the mean of some statistics, so use PyTorch operations instead to keep the graph intact, e.g. the sketch below.
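A minimal sketch, assuming compute_batch_predictions can be changed to return plain tensors keyed by (label, view_angle) tuples instead of detached NumPy arrays; average_over_view_angles is a hypothetical helper:

import torch

def average_over_view_angles(batch_predictions, labels_list):
    # group the positive-class probabilities by label
    per_label = {}
    for (label, view_angle), probs in batch_predictions.items():
        per_label.setdefault(label, []).append(probs[:, 1])
    # mean over view angles per label; result has shape (batch, num_labels)
    return torch.stack(
        [torch.stack(per_label[label]).mean(dim=0) for label in labels_list],
        dim=1,
    )

predictions = average_over_view_angles(batch_predictions, LABELS.LIST)
l = loss(predictions, gt)  # graph is intact, so l.backward() works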
