Gradient update query

Hello

I am trying to update the gradients manually. Would the following script update the gradients?

for grad, param in zip(ss, dnn_Linear.parameters()):
    pp = param * Total_loss_model
    param = pp
Does assigning to the variable param update the gradient? If yes, do I just need to call optimizer.step() afterwards?
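For reference, this is the kind of pattern I am comparing against: editing each parameter's .grad in place before calling optimizer.step(). This is only a minimal sketch with a toy linear model standing in for dnn_Linear and a constant scale standing in for Total_loss_model, not my actual code:

```python
import torch

# Toy stand-ins for dnn_Linear and its optimizer (assumptions, not the real model)
dnn_Linear = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(dnn_Linear.parameters(), lr=0.1)

x = torch.randn(4, 2)
loss = dnn_Linear(x).sum()
loss.backward()  # populates param.grad for each parameter

# Keep copies of the original gradients so the scaling is visible
grads_before = [p.grad.clone() for p in dnn_Linear.parameters()]

scale = 0.5  # stand-in for Total_loss_model
with torch.no_grad():
    for param in dnn_Linear.parameters():
        if param.grad is not None:
            param.grad.mul_(scale)  # edit the gradient tensor itself, in place

optimizer.step()  # applies the scaled gradients to the parameters
```

Here the loop mutates param.grad rather than rebinding the loop variable param, since rebinding a local name does not touch the tensor the optimizer sees.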

Thanks