How can I apply LeCun weight initialization to my Linear layer?

I'm using SELU as my activation function. As far as I know, PyTorch uses Kaiming (He) initialization for Linear layers by default. How can I apply LeCun weight initialization to my Linear layer?

You can apply a torch.nn.init method (or any other custom weight initialization) to the modules directly, or e.g. via model.apply() with a weight_init function, as described in this post.
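For example, here is a minimal sketch of a weight_init function implementing LeCun normal initialization (weights drawn from N(0, 1/fan_in), biases zeroed) for Linear layers; the model architecture and layer sizes below are just placeholders:

```python
import torch
import torch.nn as nn

def weight_init(m):
    # LeCun normal init: weights ~ N(0, 1 / fan_in), biases set to zero.
    if isinstance(m, nn.Linear):
        fan_in = m.weight.size(1)  # in_features of the Linear layer
        nn.init.normal_(m.weight, mean=0.0, std=fan_in ** -0.5)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

# Placeholder model just to show usage with SELU
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.SELU(),
    nn.Linear(64, 10),
)
model.apply(weight_init)  # applies weight_init to every submodule recursively
```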


Thank you very much!

Hoping someone might be able to elaborate a little further on this, as I'm trying to initialize a Conv2d layer's weights with LeCun normal (for SELU), but torch.nn.init doesn't seem to have a lecun_normal_() function. Is there a recommended way of achieving this?
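One possible workaround, since torch.nn.init has no built-in LeCun normal: write a small helper that computes fan_in and draws from N(0, 1/fan_in). The sketch below uses the private torch.nn.init._calculate_fan_in_and_fan_out utility (so it could change between versions), and lecun_normal_ is just a custom name for illustration:

```python
import math
import torch.nn as nn

def lecun_normal_(tensor):
    # Custom helper (not part of torch.nn.init): LeCun normal draws from N(0, 1 / fan_in).
    fan_in, _ = nn.init._calculate_fan_in_and_fan_out(tensor)  # private PyTorch utility
    std = math.sqrt(1.0 / fan_in)
    return nn.init.normal_(tensor, mean=0.0, std=std)

# Example usage on a Conv2d layer (sizes are placeholders)
conv = nn.Conv2d(3, 16, kernel_size=3)
lecun_normal_(conv.weight)
nn.init.zeros_(conv.bias)
```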