Manual Change to Gradient Doesn't Affect Performance?

Hi. I am manually modifying gradients to explore how a changed gradient affects the model's final performance.

I am wondering why there is no difference in performance even when every gradient is multiplied by 0.000001:

        for param in model_head.parameters():
            param.grad *= 0.000001


Could this be caused by the Adam optimizer?
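To illustrate my suspicion, here is a minimal single-step Adam update written in plain Python (a sketch with the usual default hyperparameters, not my actual training code). Since Adam divides the first moment by the square root of the second moment, a uniform scale factor on the gradient appears in both the numerator and the denominator and mostly cancels:

```python
import math

def adam_step(grad, m=0.0, v=0.0, t=1, lr=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-8):
    # Standard Adam moment updates with bias correction.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # The scale of grad cancels in m_hat / sqrt(v_hat), up to eps.
    return lr * m_hat / (math.sqrt(v_hat) + eps)

step_full = adam_step(0.5)         # unscaled gradient
step_tiny = adam_step(0.5 * 1e-6)  # gradient multiplied by 0.000001
print(step_full, step_tiny)        # nearly identical step sizes
```

The two resulting step sizes differ only through `eps`, which would explain why scaling the gradients has almost no visible effect on performance.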

Thank you.

Best Regards