Calculating the grad of a tensor that isn't a leaf tensor

I'm using the following code to get the gradients of the tensor. I want to change the tensor values with respect to the loss:

> import torch
> from facenet_pytorch import InceptionResnetV1
> import torch.nn as nn
> resnet = InceptionResnetV1(pretrained='vggface2').eval()
> tensor1 = torch.tensor((), dtype=torch.float32)
> tensor2 = torch.tensor((), dtype=torch.float32)
> tensor1 = torch.stack([(tensor1.new_ones(3,160,160))])
> tensor2 = torch.stack([(tensor2.new_ones(3,160,160))])
> tensor1.require_grad=True
> ev1 = resnet(tensor1)
> ev2 = resnet(tensor2)
> print(ev1.shape) # torch.Size([1, 512])
> cos = nn.CosineSimilarity()
> loss = cos(ev1, ev2)
> loss.backward(retain_graph=True)
> data_grad =
> print(data_grad)

It seems that loss.backward() doesn't calculate anything, because the next line fails

(grad is None):
     15 loss.backward(retain_graph=True)
---> 16 data_grad =

AttributeError: 'NoneType' object has no attribute 'data'

I also got a warning that I'm trying to access the .grad attribute of a tensor that isn't a leaf tensor. So how can I calculate the grad of the original tensor?

The error message should also mention that you can use retain_grad() on the non-leaf variable to get the gradient.

ev1 = resnet(tensor1)
ev1.retain_grad()

should therefore work.
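As a runnable sketch of the retain_grad() approach (using a small stand-in linear model instead of the real InceptionResnetV1, since only the autograd mechanics matter here):

```python
import torch
import torch.nn as nn

# Small stand-in for the embedding network; the mechanics are the same
# with InceptionResnetV1, only the shapes differ.
model = nn.Linear(4, 2)

x = torch.ones(1, 4, requires_grad=True)
ev = model(x)      # ev is a non-leaf tensor (it has a grad_fn)
ev.retain_grad()   # ask autograd to keep ev.grad after backward()

loss = ev.sum()
loss.backward()

print(ev.grad is None)  # False -- the gradient was retained
```

Without the retain_grad() call, ev.grad would be None after backward(), since autograd only populates .grad for leaf tensors by default.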

Unrelated to this issue, but you should use .requires_grad (with an s), and you should not use the .data attribute, as it might yield unwanted side effects.

Thanks! I think that in my case I wanted to retain the grad of the original tensor (tensor1) and not of the embedded vector (ev1). It should work the same, right?

It depends on which tensor should get the gradients.
If you want to get gradients for tensor1, you should use

tensor1.requires_grad = True
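Putting it together for gradients w.r.t. the input, here is a minimal sketch with a stand-in embedding network in place of InceptionResnetV1 (so it runs without facenet_pytorch; substitute the real model and 3x160x160 inputs for the actual use case):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in embedding network; swap in InceptionResnetV1 for the real case.
embed = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 16))

tensor1 = torch.ones(1, 3, 8, 8)
tensor2 = torch.randn(1, 3, 8, 8)
tensor1.requires_grad_(True)  # note the "s": .require_grad would be silently ignored

ev1 = embed(tensor1)
ev2 = embed(tensor2)

loss = nn.CosineSimilarity()(ev1, ev2)
loss.backward()

# tensor1 is a leaf with requires_grad=True, so its .grad is populated
print(tensor1.grad.shape)  # torch.Size([1, 3, 8, 8])
```

The key point is that requires_grad must be enabled on the leaf tensor before the forward pass, so the whole graph from tensor1 to loss is recorded.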

Thank you very much !