What are the default initialization methods for layers?

So PyTorch uses He initialization when the activation is ReLU? I'm confused about what PyTorch actually does by default.
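For context, here is a small sketch of what I understand so far (assuming recent PyTorch versions, where `nn.Linear.reset_parameters` calls `kaiming_uniform_(weight, a=math.sqrt(5))`, which works out to `Uniform(-1/sqrt(fan_in), 1/sqrt(fan_in))` regardless of which activation follows the layer):

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Default init: kaiming_uniform_ with a=sqrt(5), NOT chosen based on ReLU.
# For fan_in = 256 the bound simplifies to 1/sqrt(256) = 0.0625.
layer = nn.Linear(in_features=256, out_features=128)
bound = 1 / math.sqrt(layer.in_features)
print(layer.weight.abs().max().item() <= bound)  # weights stay inside the bound

# To explicitly use He (Kaiming) init tuned for ReLU, override it yourself:
nn.init.kaiming_normal_(layer.weight, nonlinearity="relu")
nn.init.zeros_(layer.bias)
```

Is that right, or does PyTorch pick the scheme depending on the activation somewhere?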