How to get the difference to a target in PyTorch?

I apologize if I am not using the correct terminology here, so feel free to correct me if there’s a better way to describe what I am trying to do.

I want to create a new variable dG, which is equal to the difference to the target. I would like to do this with two variables, self.target and self.G, where self.target is my target.

Both self.target and self.G are torch.Size([512, 512]).

In Lua/Torch, this is accomplished with:

dG = self.crit:backward(self.G, self.target)

But I can’t seem to figure out how to do this in PyTorch.
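For reference, here is a minimal sketch of how the Lua/Torch call could translate to PyTorch, under the assumption that self.crit was an MSE criterion (the thread does not say which criterion is used, so nn.MSELoss here is an assumption, and G/target are stand-in tensors):

```python
import torch

# Stand-ins for self.G and self.target (assumed shapes from the thread)
G = torch.randn(512, 512, requires_grad=True)
target = torch.randn(512, 512)

# Assumption: the Lua criterion was MSE; swap in your actual criterion
crit = torch.nn.MSELoss()
loss = crit(G, target)

# In Lua/Torch, crit:backward(G, target) returned the gradient of the
# criterion with respect to its input; torch.autograd.grad does the same
dG, = torch.autograd.grad(loss, G)
```

This avoids cloning into a second Variable entirely: autograd computes the gradient of the loss with respect to G directly.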

Would this work for what I am trying to do?

self.G2 = Variable(self.G.clone().data, requires_grad=True)
self.G2.backward(self.target, retain_graph=True)
dG = self.G2.grad

What you’re asking for is not the difference, it’s the gradient. To get the difference, just compute self.target - self.G.
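If the plain difference really is what is wanted, it is a single element-wise subtraction (stand-in tensors below, same assumed shape as in the thread):

```python
import torch

# Stand-ins for self.target and self.G
target = torch.randn(512, 512)
G = torch.randn(512, 512)

# Element-wise difference to the target; same shape as the inputs
dG = target - G
```

No autograd machinery, backward() call, or Variable wrapping is needed for this.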