I’m trying to backpropagate a hand-made gradient through my model and then update the parameters manually, like this:

```python
import torch

this_out = model(this_inp)
model.zero_grad()

# Seed backward() with a gradient of 1.0 w.r.t. the output
this_diff = torch.randn(1, 1)
this_diff[0] = 1.0
this_diff = this_diff.double()
this_out.backward(this_diff)

# Manual parameter update
for f in model.parameters():
    _ = f.data.add_(f.grad.data * learning_rate)
```

but the update loop fails because `f.grad` is `None` (accessing `.data` on it raises a `NoneType` error). What would cause `backward()` to leave the parameter gradients unset?
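For comparison, here is a minimal self-contained version of what I’m attempting, with a toy `nn.Linear` standing in for my real model (`model`, `this_inp`, and `learning_rate` are placeholders). In this sketch every parameter does get a gradient after `backward()`:

```python
import torch
import torch.nn as nn

# Toy stand-ins for my real setup (names are placeholders)
torch.manual_seed(0)
model = nn.Linear(3, 1).double()
this_inp = torch.randn(1, 3, dtype=torch.double)
learning_rate = 0.01

this_out = model(this_inp)
model.zero_grad()

# Gradient of 1.0 w.r.t. the (1, 1) output, as in my snippet
this_diff = torch.ones(1, 1, dtype=torch.double)
this_out.backward(this_diff)

# Here every parameter has a populated .grad
for f in model.parameters():
    assert f.grad is not None
    _ = f.data.add_(f.grad.data * learning_rate)
```

Since this toy version runs without the `NoneType` error, I suspect the difference is in how my real model’s parameters enter the computation graph.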