I don’t think there is any generic way of testing it, since it depends on your init_method.
What you can do is loop over the parameters and check some stats:
```python
for name, p in net.named_parameters():
    check_method(name, p)
```
This check method could look at e.g. `p.mean()` or `p.std()`, and you could draw a conclusion from those values.
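As a sketch, a hypothetical `check_method` (the name is just a placeholder from the snippet above) might print per-parameter statistics so you can eyeball whether they match what your `init_method` should produce:

```python
import torch
import torch.nn as nn

def check_method(name, p):
    # Hypothetical helper: print simple statistics for each parameter.
    # For a weight matrix you could compare p.std() against the value
    # your init_method is expected to produce.
    with torch.no_grad():
        print(f"{name}: shape={tuple(p.shape)}, "
              f"mean={p.mean().item():.4f}, std={p.std().item():.4f}")

net = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 5))
for name, p in net.named_parameters():
    check_method(name, p)
```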
`nn.init` functions only run simple in-place init operations (without autograd) on the Tensors that represent the parameters. As you can see here (called there).
These operations don’t have any side effects except the change of values in the param Tensor.
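A minimal illustration of that: an `nn.init` call modifies the parameter tensor in place (under `no_grad`) and returns the same tensor object, leaving `requires_grad` and the autograd state untouched:

```python
import torch
import torch.nn as nn

lin = nn.Linear(4, 4)
w = lin.weight

# Re-initialize in place; the call returns the same tensor object.
out = nn.init.xavier_uniform_(w)

assert out is w            # same tensor, values changed in place
assert w.requires_grad     # the requires_grad flag is untouched
assert w.grad_fn is None   # no autograd history was recorded
```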