If parameters don't receive gradients, does weight_decay still affect them?

Hi,
```python
a = model()        # forward pass
a.detach_()        # detach the output from the graph in place
c = a + b
l = calcloss(c)
optimizer = optim.Adam(model.parameters(), lr=0.01, weight_decay=1e-5)
l.backward()
optimizer.step()
```

Do the parameters of `model` (whose output `a` was detached) change because of weight_decay?
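For what it's worth, here is a minimal, self-contained sketch of the same setup that you can run to check. It assumes a small `nn.Linear` in place of `model` and a sum in place of `calcloss`. Since `a` is detached, no gradient reaches the linear layer, its `.grad` stays `None`, and `optimizer.step()` skips parameters with `None` gradients, so weight_decay is never applied to them:

```python
import torch
from torch import nn, optim

lin = nn.Linear(2, 2)                    # stand-in for "model"
before = lin.weight.detach().clone()     # snapshot to compare against later
optimizer = optim.Adam(lin.parameters(), lr=0.01, weight_decay=1e-5)

x = torch.randn(4, 2)
a = lin(x)
a.detach_()                              # cut the graph: no grads flow into lin
b = torch.randn(4, 2, requires_grad=True)
c = a + b
l = c.sum()                              # stand-in for "calcloss(c)"
l.backward()                             # only b gets a gradient
optimizer.step()

print(lin.weight.grad)                   # None: backward never reached lin
print(torch.equal(lin.weight, before))   # True: weight_decay did not touch it
```

So the decisive question is whether the parameter's `.grad` is `None` after `backward()`; Adam couples weight decay to the gradient update and skips such parameters entirely.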