I noticed that the default initialization method for Conv and Linear layers in PyTorch is `kaiming_uniform_`.

I just don’t understand why the default value of `negative_slope` (the default nonlinearity is `leaky_relu`) is √5.

Is it written just for simplicity or for some specific reason?

```
def reset_parameters(self):
    init.kaiming_uniform_(self.weight, a=math.sqrt(5))
    if self.bias is not None:
        fan_in, _ = init._calculate_fan_in_and_fan_out(self.weight)
        bound = 1 / math.sqrt(fan_in)
        init.uniform_(self.bias, -bound, bound)
```
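For what it’s worth, I worked through the math. A small sketch (replicating the `kaiming_uniform_` formula from the docs in plain Python, not calling torch itself): with `mode="fan_in"` and `nonlinearity="leaky_relu"`, the gain is √(2 / (1 + a²)) and the weight bound is gain · √(3 / fan_in). Plugging in a = √5 makes the weight bound come out to exactly 1/√fan_in — the same bound the snippet above uses for the bias.

```python
import math

def kaiming_uniform_bound(fan_in: float, a: float) -> float:
    # Mirrors the documented formula of torch.nn.init.kaiming_uniform_
    # with mode="fan_in", nonlinearity="leaky_relu":
    #   gain  = sqrt(2 / (1 + a^2))
    #   bound = gain * sqrt(3 / fan_in)
    gain = math.sqrt(2.0 / (1.0 + a ** 2))
    return gain * math.sqrt(3.0 / fan_in)

fan_in = 128  # arbitrary example fan-in
bound = kaiming_uniform_bound(fan_in, a=math.sqrt(5))
print(bound)                    # equals 1 / sqrt(fan_in)
print(1 / math.sqrt(fan_in))    # same value
```

So a = √5 appears to just reproduce the classic U(−1/√fan_in, 1/√fan_in) initialization through the Kaiming formula, rather than corresponding to any actual leaky-ReLU slope.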