Optimizer learning rate

Hi all,
If I set the learning rate when constructing the optimizer, does updating it through param_groups not work?

optimizer = optim.Adam(model.parameters(), lr=0.01)  # `model` stands in for the original placeholder

# Decay the learning rate by a factor of 10 every 30 epochs
lr = args.lr * (0.1 ** (epoch // 30))
for param_group in optimizer.param_groups:
    param_group['lr'] = lr
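For reference, this manual decay is the same schedule as the built-in torch.optim.lr_scheduler.StepLR. A minimal self-contained sketch, using a stand-in parameter instead of a real model:

import torch
import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

params = [torch.randn(1, requires_grad=True)]  # stand-in for model.parameters()
optimizer = optim.Adam(params, lr=0.01)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)  # lr *= 0.1 every 30 epochs

for epoch in range(90):
    # ... training for one epoch would go here ...
    optimizer.step()
    scheduler.step()
print(optimizer.param_groups[0]['lr'])  # 1e-05 after 90 epochs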

It’s working on my machine:

import torch
import torch.optim as optim

optimizer = optim.Adam([torch.randn(1, requires_grad=True)], lr=1e-3)
for param_group in optimizer.param_groups:
    param_group['lr'] = 1.
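You can read the value back to verify the assignment stuck:

import torch
import torch.optim as optim

optimizer = optim.Adam([torch.randn(1, requires_grad=True)], lr=1e-3)
print(optimizer.param_groups[0]['lr'])  # 0.001

for param_group in optimizer.param_groups:
    param_group['lr'] = 1.
print(optimizer.param_groups[0]['lr'])  # 1.0 -- the new value is stored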

What kind of error do you get?

Thanks for the reply.
There is no error; I was just wondering whether the param_groups update actually works!
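A quick way to convince yourself is to watch a parameter move. A minimal sketch, using plain SGD instead of Adam so the step size is easy to read off:

import torch
import torch.optim as optim

p = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([p], lr=0.1)

p.grad = torch.ones(1)
optimizer.step()
print(p.item())  # -0.1, i.e. moved by lr * grad = 0.1

optimizer.param_groups[0]['lr'] = 1.0  # change the lr in place
p.grad = torch.ones(1)
optimizer.step()
print(p.item())  # -1.1, so the second step used the new lr of 1.0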