Questions about Xavier weight initialization

Hi all,
I am wondering whether my Xavier initialization usage is correct,
and I am also confused about adjusting the learning rate.
If I pass lr=0.001 to the optimizer (SGD), will that learning rate actually be applied?

Could anyone check my code, please?

model = model.cuda()
for m in model.modules():
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        nn.init.xavier_uniform_(m.weight)

optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
lr = 0.001
for param_group in optimizer.param_groups:
    param_group['lr'] = lr

optimizer.zero_grad()
optimizer.step()
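To answer your own question directly: yes, the lr you pass to optim.SGD is stored in optimizer.param_groups and used at every step, so the extra loop overwriting it with the same value is redundant. Below is a minimal sketch on a toy model (the layer sizes are made up for illustration) that applies Xavier initialization correctly and checks the learning rate:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Toy model just for demonstration (assumed shapes, not your actual network)
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.Flatten(), nn.Linear(32, 4))

for m in model.modules():
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        # Note the trailing underscore: xavier_uniform_ is the in-place
        # version; plain xavier_uniform is deprecated.
        nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)  # biases are commonly zero-initialized

optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

# The lr given to the constructor is already in every param group,
# so there is no need to set it again manually.
print(optimizer.param_groups[0]['lr'])  # 0.001
```

If you later want to change the learning rate during training, that is when writing to param_group['lr'] (or using a torch.optim.lr_scheduler) makes sense.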