How to use external libraries like OpenCV, NumPy, SciPy, etc. in the middle of the forward/backward phase

Hi, I wonder if there is a way to use external libraries in the middle of the forward and backward passes.

For example, is the following approach possible?

optimizer.zero_grad()
prediction = Net(input)
clone_prediction = prediction.clone()
clone_prediction = clone_prediction.detach().cpu().numpy().transpose(0, 2, 3, 1)

loss_outside_of_pt = opencv_function(clone_prediction)
loss_outside_of_pt = numpy_function(loss_outside_of_pt)

loss_v = torch.Tensor([loss_outside_of_pt])

loss_v.backward()
optimizer.step()

I actually tried the approach above and got this error:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

which probably means there is no backward graph connection, so backward cannot be performed.
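A minimal standalone reproduction of that behavior (toy tensors instead of the actual Net): once a value passes through .detach() and NumPy, the tensor rebuilt from it has no grad_fn, so backward() raises exactly this error.

import torch

x = torch.randn(3, requires_grad=True)
y = x.detach().cpu().numpy()        # leaves the autograd graph here
z = torch.Tensor([float(y.sum())])  # rebuilt tensor: requires_grad=False
print(z.grad_fn)                    # None -> nothing to backpropagate through
z.backward()                        # RuntimeError: element 0 of tensors does not require grad ...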

So I considered manually connecting the backward graph like this:

optimizer.zero_grad()
prediction = Net(input)
clone_prediction = prediction.clone()
clone_prediction = clone_prediction.detach().cpu().numpy().transpose(0, 2, 3, 1)

loss_outside_of_pt = opencv_function(clone_prediction)
loss_outside_of_pt = numpy_function(loss_outside_of_pt)

# I tried to use small_v to connect the forward/backward graph
small_v = torch.sqrt(torch.sum(prediction)) * 0.0001
loss_v = torch.Tensor([loss_outside_of_pt])
summed_loss_v = small_v + loss_v

summed_loss_v.backward()
optimizer.step()

I confirmed this can perform forward and backward, but the loss decrease does not look correct
(actually, the loss increased).
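That matches what the graph actually contains: loss_v is a constant with respect to the network, so the gradient comes from small_v alone. A quick check with toy tensors (hypothetical stand-ins for prediction and the external loss) shows the external value never reaches the gradient:

import torch

p = torch.rand(4, requires_grad=True)      # stand-in for prediction
small_v = torch.sqrt(torch.sum(p)) * 0.0001
loss_v = torch.Tensor([123.0])             # stand-in for the external loss
(small_v + loss_v).backward()
print(p.grad)                              # gradient of small_v only; changing 123.0 changes nothing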

And the following approach seems to work.
Is this the correct way? Is there any problem with updating the network?

optimizer.zero_grad()
prediction = Net(input)

# I calculate the loss with PyTorch variables
loss_v_from_pt = pytorch_loss_func(prediction, gt)

clone_prediction = prediction.clone()
clone_prediction = clone_prediction.detach().cpu().numpy().transpose(0, 2, 3, 1)

# I calculate the loss outside of PyTorch variables
loss_outside_of_pt = opencv_function(clone_prediction)
loss_outside_of_pt = numpy_function(loss_outside_of_pt)

# Loss calculated outside of PyTorch variables
loss_v = torch.Tensor([loss_outside_of_pt])
# loss_v_from_pt is calculated with PyTorch variables,
# so the forward/backward graph is still connected
summed_loss_v = loss_v_from_pt + loss_v
summed_loss_v.backward()
optimizer.step()

I wonder if there is a way to use external libraries in the middle of the forward and backward passes.

Otherwise, if all of the above ways of cloning and detaching a PyTorch variable to use external libraries are impossible, does that mean I should only use PyTorch functions in the middle of the forward/backward phase?

Thanks.


It is not possible to use NumPy/OpenCV operations between the forward()/backward() computations unless you implement those operations yourself using Torch tensors (in a differentiable manner).

When you use .detach(), you are already disconnecting the graph, so the further calculation with NumPy/OpenCV has no effect on the weights of the Net.
Essentially, the loss in your code (summed_loss_v = loss_v_from_pt + loss_v) is interpreted as summed_loss_v = loss_v_from_pt + CONSTANT, hence it has no effect on optimization.
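One standard escape hatch, in case it helps: if you can write down the gradient of the external operation yourself, you can wrap it in a custom torch.autograd.Function so the NumPy call participates in the graph. A minimal sketch (the element-wise np.exp here is just a placeholder for a real external operation, whose analytic gradient you would have to derive):

import numpy as np
import torch

class ExternalOp(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        # any external library can run here; autograd does not trace it
        result = np.exp(input.detach().cpu().numpy())
        return torch.from_numpy(result).to(input.device)

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        # hand-written gradient of the external op: d/dx exp(x) = exp(x)
        return grad_output * torch.exp(input)

x = torch.randn(5, requires_grad=True)
y = ExternalOp.apply(x).sum()
y.backward()
print(x.grad)  # matches torch.exp(x)

For an OpenCV transform this only works if you can derive (or numerically approximate) the backward formula yourself, which is exactly the "implement in a differentiable manner" caveat above.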

Thanks. It would be great if I could use external libraries in the middle of forward/backward,
because there are many situations where I need complicated operations.
Thanks for the helpful answer.