Does initialization give similar result despite different learning rates?

Hi, I have included an initialisation part in my code, which looks like this:

```python
for m in self.modules():
    if isinstance(m, (nn.Conv2d, nn.Conv3d)):
        nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
    elif isinstance(m, (nn.BatchNorm2d, nn.BatchNorm3d)):
        nn.init.constant_(m.weight, 1)
        nn.init.constant_(m.bias, 0)
```

I have tested my training with two different learning rates, (1) 0.0005 and (2) 0.01, but both give quite similar results and the loss drops at roughly the same rate. I have also fixed the random values using the same seed.
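
My comparison is set up roughly like this (a simplified sketch; the model, data, and loss below are just placeholders, not my actual network):

```python
import torch
import torch.nn as nn

def run_training(lr, steps=50, seed=0):
    # Fix the seed so both runs start from identical weights and see the same data.
    torch.manual_seed(seed)

    # Placeholder model and data, standing in for the real network and loader.
    model = nn.Sequential(
        nn.Conv2d(3, 8, 3, padding=1),
        nn.BatchNorm2d(8),
        nn.ReLU(),
        nn.Flatten(),
        nn.Linear(8 * 32 * 32, 10),
    )
    x = torch.randn(16, 3, 32, 32)
    y = torch.randint(0, 10, (16,))

    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()

    losses = []
    for _ in range(steps):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
        losses.append(loss.item())
    return losses

# Run the same comparison with the two learning rates I tested.
losses_low = run_training(lr=0.0005)
losses_high = run_training(lr=0.01)
print(losses_low[::10])
print(losses_high[::10])
```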

Could it be because of the initialisation code above, or some other bug in my code, that is causing this?

Thank you

That will depend heavily on the optimizer you're using and the model you're trying to optimize.
If your model is well conditioned, the optimizer will converge quickly with (almost) any learning rate.
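
As a quick sanity check (just a sketch with plain SGD and a placeholder model, not your actual setup), you could also verify that the learning rate really reaches the optimizer and that the update size scales with it:

```python
import copy
import torch
import torch.nn as nn

model = nn.Linear(10, 2)                      # placeholder model
initial_state = copy.deepcopy(model.state_dict())
x = torch.randn(4, 10)                        # fixed batch so both runs see the same gradient

for lr in (0.0005, 0.01):
    # Restart from the same weights for a fair comparison.
    model.load_state_dict(initial_state)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    print("lr reported by the optimizer:", optimizer.param_groups[0]["lr"])

    before = model.weight.detach().clone()
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # With plain SGD the weight change after one step is lr * grad,
    # so it should scale linearly with the learning rate.
    print(lr, (model.weight.detach() - before).abs().mean().item())
```

If the measured change does not scale with the learning rate in your real setup (for example because an adaptive optimizer such as Adam normalizes the step size), that could explain why both runs look so similar.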

How to determine if a model is well conditioned?