Thanks. I have a question.

I'm writing a custom loss function.

```
def forward(self, input, target):
    # one_hot is my helper that returns a (batch, num_classes) one-hot matrix
    y = one_hot(target, input.size(-1))
    Psoft = torch.nn.functional.softmax(input, dim=-1).cpu()
    Loss = 0
    t1 = target.view(1, target.size(0)).cpu()
    for i in range(target.size(0)):  # was range(0, size - 1), which skipped the last row
        flag = int(t1[0, i].item())
        for j in range(1, flag + 2):
            P1 = Psoft[i, :j]
            y1 = y[i, :j]
            Loss += (P1 - y1).sum().pow(2)  # accumulate, not overwrite
        if flag != 7:
            for k in range(flag + 1, 9):  # note: the slice below doesn't depend on k
                P2 = Psoft[i, flag + 1:8]
                y2 = y[i, flag + 1:8]
                Loss += (P2 - y2).sum().pow(2)
    Loss = Loss / target.size(0)
    print(Loss.grad)  # this prints None
    return Loss
```

It's written on PyTorch 0.4.

`target` is a 64×1 tensor and `Psoft` is a 64×8 tensor. I found that `Loss.grad` is `None`. How do I get the gradients? Thanks a lot!
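For reference, here is a minimal standalone repro (hypothetical tensors, not my real model). My guess is that `Loss.grad` is `None` because `Loss` is a non-leaf tensor, and autograd only populates `.grad` on leaf tensors unless `retain_grad()` is called on the intermediate result:

```python
import torch

# w is a leaf tensor, so autograd fills w.grad after backward().
w = torch.randn(3, requires_grad=True)

# loss is a *computed* (non-leaf) tensor; by default its .grad stays None.
loss = (w * 2).sum()
loss.retain_grad()  # ask autograd to keep the gradient for this non-leaf tensor
loss.backward()

print(loss.grad)  # without retain_grad() this would print None
print(w.grad)     # leaf gradients are always populated
```

With the `retain_grad()` line removed, `loss.grad` stays `None` after `backward()`, while `w.grad` is still filled in.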