Tensor produced by an operation is still a leaf node

Calling z.backward() fails with "RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn":

a0 = torch.FloatTensor([1]).clone().detach()
a0 = a0.to(device)
a0.requires_grad_ = True # create a leaf tensor
z = a0 * 2
opt = torch.optim.SGD([a0], lr = 0.01)
z.backward()
opt.step()

I believe a0 is a leaf tensor on the device, so backward() should have populated a0.grad. After several rounds of checking I found that z was still a leaf node. Why is that?

The line a0.requires_grad_ = True is wrong: requires_grad_ is an in-place method, not an attribute, so this assignment overwrites the method with the boolean True instead of enabling gradients. Since a0 still has requires_grad=False, z = a0 * 2 carries no grad_fn and remains a leaf, which is exactly what the error message reports.
Use:

a0.requires_grad_()

(or assign to the plain attribute: a0.requires_grad = True) and it should work.
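For reference, here is a corrected version of the snippet (a minimal sketch on CPU, so the .to(device) call is dropped):

```python
import torch

a0 = torch.FloatTensor([1]).clone().detach()
a0.requires_grad_()          # method call, not an assignment; a0 is a leaf
# equivalently: a0.requires_grad = True

z = a0 * 2                   # z now has a grad_fn, so it is no longer a leaf
opt = torch.optim.SGD([a0], lr=0.01)
z.backward()
print(a0.grad)               # tensor([2.]), since dz/da0 = 2
opt.step()                   # a0 becomes 1 - 0.01 * 2 = 0.98
```

With requires_grad enabled before the multiplication, z.is_leaf is False and a0.grad is populated as expected.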