Convert a variable into numpy then reconvert it back

Hello all! I have a question about the autograd, please look at the following the example.

out = net(input)
out = out.detach().numpy()
new_output = function_1(out)  # function_1 contains some numpy operations
new_output = torch.from_numpy(new_output).float()
new_output = new_output.to(device).requires_grad_()
loss = loss_function(new_output, label)
loss.backward()

When I call loss.backward(), will the parameters in net also be updated?

Thanks


In short, no. The parameters of the model will not receive gradients, because calling .detach() cuts the computation graph before the loss is computed, so backpropagation stops at new_output and never reaches net.
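A minimal, runnable sketch of the situation (the single linear layer, the shapes, and the MSE loss are hypothetical stand-ins for the poster's net and loss_function): after the numpy round-trip, the re-created tensor gets a gradient, but the network's weights do not.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the poster's net
net = nn.Linear(4, 2)
input = torch.randn(1, 4)
label = torch.randn(1, 2)

out = net(input)
out_np = out.detach().numpy()          # the graph is cut here
new_output = torch.from_numpy(out_np).float().requires_grad_()

loss = nn.functional.mse_loss(new_output, label)  # stand-in loss_function
loss.backward()

# Gradients reach new_output (a fresh leaf tensor), but never the
# network's parameters, since they are no longer connected to loss.
print(new_output.grad is not None)     # True
print(net.weight.grad is None)         # True
```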

Hi, do you know another way to solve this? Can I perform NumPy operations between the output of a network and the loss calculation at all?

Nope, to my understanding, if you want to backpropagate through those operations, they have to be torch operations. You should re-implement the functions using torch.
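To illustrate the suggestion above with a hedged example: suppose function_1 computed something like np.log(np.abs(out) + 1) (a made-up placeholder, since the thread never shows its body). Writing the same math with torch ops keeps the graph intact, so gradients flow back into the net.

```python
import torch
import torch.nn as nn

# Hypothetical torch re-implementation of a numpy-based function_1.
# Assumed body: np.log(np.abs(out) + 1) -> same math in torch.
def function_1_torch(out):
    return torch.log(torch.abs(out) + 1)

# Hypothetical stand-ins for the poster's net and loss_function
net = nn.Linear(4, 2)
input = torch.randn(1, 4)
label = torch.randn(1, 2)

out = net(input)                        # no .detach(), no numpy round-trip
new_output = function_1_torch(out)
loss = nn.functional.mse_loss(new_output, label)
loss.backward()

# The graph was never cut, so the parameters now have gradients.
print(net.weight.grad is not None)      # True
```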


Ok. Thank you for your answer.

No, you can’t, but torch provides equivalents for most NumPy operations, so you can usually translate the function directly.