How to check whether two tensors are connected in the autograd graph

I am trying to backpropagate my model's loss to its input in order to compute adversarial examples. The idea is to train an adversarial sticker that can be added to an input image to make an object detection system fail. However, when I backpropagate the loss towards the input by calling torch.autograd.grad(loss, sticker), I get the following RuntimeError: "One of the differentiated Tensors appears to not have been used in the graph."

My graph looks something like this:

sticker, images -> apply sticker -> object detection -> loss

The loss is defined by how well the object detector succeeds in recognizing the objects.
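For context, the pipeline looks roughly like the sketch below. The apply_sticker function and the tiny convolutional "detector" are hypothetical stand-ins for my real code, just to show the shape of the computation:

```python
import torch

sticker = torch.zeros(3, 16, 16, requires_grad=True)  # trainable adversarial patch
images = torch.rand(2, 3, 64, 64)                     # batch of input images

def apply_sticker(imgs, patch, y=0, x=0):
    # Paste the patch onto a clone; slice assignment keeps autograd tracking
    out = imgs.clone()
    out[:, :, y:y + patch.shape[1], x:x + patch.shape[2]] = patch
    return out

# Stand-in detector: any differentiable function of the image works here
detector = torch.nn.Conv2d(3, 1, kernel_size=3)

stickered = apply_sticker(images, sticker)
loss = detector(stickered).mean()

# With nothing detached along the way, this succeeds:
grad = torch.autograd.grad(loss, sticker)[0]
print(grad.shape)  # same shape as the sticker
```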

I suspect I get this error because somewhere along my calculations I used an operation that disconnected the sticker from the graph, but I'm not sure how to determine where that happens. Is there any way to check for this?
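For reference, one way I imagine checking this is to walk the autograd graph backwards from the loss and look for the input tensor. This relies on grad_fn.next_functions and the .variable attribute of AccumulateGrad nodes, which are undocumented internals but have been stable for years. A minimal sketch, where .detach() stands in for whatever operation breaks the graph:

```python
import torch

def is_connected(output, leaf):
    """Walk the autograd graph backwards from `output` and report whether
    `leaf` (a tensor with requires_grad=True) feeds into it.
    Leaf tensors appear in the graph as AccumulateGrad nodes that carry
    the tensor in their .variable attribute (undocumented internal)."""
    seen, stack = set(), [output.grad_fn]
    while stack:
        fn = stack.pop()
        if fn is None or fn in seen:
            continue
        seen.add(fn)
        if getattr(fn, "variable", None) is leaf:
            return True
        stack.extend(next_fn for next_fn, _ in fn.next_functions)
    return False

sticker = torch.randn(3, 8, 8, requires_grad=True)
image = torch.randn(3, 8, 8)

loss_ok = (image + sticker).sum()            # sticker is used
loss_bad = (image + sticker.detach()).sum()  # graph is broken here

print(is_connected(loss_ok, sticker))   # True
print(is_connected(loss_bad, sticker))  # False
```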

Did you use .detach(), .data, or .numpy() somewhere in the chain? Those are the main suspects for a disconnected graph: each returns a tensor that no longer tracks gradients, so everything computed from it is cut off from the original input.
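A quick way to confirm the suspicion without hunting through the code: pass allow_unused=True to torch.autograd.grad, which returns None for a disconnected input instead of raising. A minimal sketch reproducing the error (weight is a hypothetical stand-in for the detector's parameters):

```python
import torch

sticker = torch.randn(4, requires_grad=True)
weight = torch.randn(4, requires_grad=True)  # hypothetical stand-in for detector params

# .detach() severs sticker from the graph, so this raises the RuntimeError:
loss = (weight * sticker.detach()).sum()
try:
    torch.autograd.grad(loss, sticker)
    msg = ""
except RuntimeError as err:
    msg = str(err)
print(msg)

# Rebuild the graph and ask again with allow_unused=True:
loss = (weight * sticker.detach()).sum()
grads = torch.autograd.grad(loss, sticker, allow_unused=True)
print(grads[0] is None)  # True: None signals the disconnection instead of raising
```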

Thank you! It turns out I had indeed used .detach() somewhere in the chain, just as you suspected.