Differentiated input is unreachable in double backward

I’m getting the error "differentiated input is unreachable" from torch.autograd.grad in the code below. I’m on PyTorch 0.3.0, and everything runs fine up until the torch.autograd.grad call on the second-to-last line. Any help would be appreciated!

import torch
import torch.nn.functional as F

# net, optimizer, weights (a Variable with requires_grad=True),
# data/target, and data_val/target_val are defined earlier

# forward
output = net(data)

# backward
loss = F.cross_entropy(output, target, reduce=False)  # per-example losses
weights_normalized = weights / (weights.sum() + 1e-10)
loss = (weights_normalized * loss).sum()  # weighted sum of the per-example losses
optimizer.zero_grad()
loss.backward(create_graph=True)  # keep the graph for the double backward below
optimizer.step()

# forward val
output = net(data_val)

# backward val
loss = F.cross_entropy(output, target_val)
# the next line raises "differentiated input is unreachable"
weights_grad, = torch.autograd.grad(loss, weights, only_inputs=True)
weights = weights + 1e-2 * weights_grad
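
For reference, here is the smallest standalone snippet I could write that raises the same error for me (the names w, x, y are made up for illustration); it just asks for a gradient with respect to a Variable that the output does not depend on. I wonder whether my setup above somehow reduces to this case after optimizer.step():

import torch
from torch.autograd import Variable

# y has no path to w in the autograd graph, so asking for grad(y, w)
# raises "differentiated input is unreachable" on 0.3.0
w = Variable(torch.ones(3), requires_grad=True)
x = Variable(torch.ones(3), requires_grad=True)
y = (2 * x).sum()
torch.autograd.grad(y, w)  # RuntimeError raised here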