How can I apply LeCun weight initialization to my Linear layer?

I'm using SELU as my activation function. As far as I know, PyTorch uses He (Kaiming) initialization for Linear layers by default. How can I apply LeCun weight initialization to my Linear layer?

You can apply the torch.nn.init methods (or any other custom weight initialization) to the modules directly, or e.g. via model.apply() and a weight_init method as described in this post.
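
For example, here is a minimal sketch of the model.apply() approach (the weights_init helper name and the layer sizes are placeholders of my own choosing; kaiming_normal_ with mode='fan_in' and nonlinearity='linear' works out to std = sqrt(1/fan_in), i.e. LeCun normal):

import torch.nn as nn

def weights_init(m):
    # Initialize only Linear layers; other module types are left untouched.
    if isinstance(m, nn.Linear):
        nn.init.kaiming_normal_(m.weight, mode='fan_in', nonlinearity='linear')
        if m.bias is not None:
            nn.init.zeros_(m.bias)

model = nn.Sequential(nn.Linear(128, 64), nn.SELU(), nn.Linear(64, 10))
model.apply(weights_init)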


Thank you very much!

Hoping someone might be able to elaborate a little further on this, as I'm trying to initialize a Conv2d layer's weights with LeCun normal (for SELU), but torch.nn.init doesn't seem to have a lecun_normal_() function. Is there a recommended way of achieving this?

import math
import torch

def lecun_normal_(tensor: torch.Tensor) -> torch.Tensor:
    # LeCun normal draws from N(0, 1/fan_in); fan_in is the number of input
    # units (in_features for Linear, in_channels * kH * kW for Conv2d).
    fan_in = tensor[0].numel()
    std = math.sqrt(1.0 / fan_in)
    with torch.no_grad():
        return tensor.normal_(0.0, std)
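
A quick usage sketch (the layer shape here is arbitrary, just for illustration):

conv = torch.nn.Conv2d(3, 16, kernel_size=3)  # weight shape (16, 3, 3, 3), fan_in = 27
lecun_normal_(conv.weight)
torch.nn.init.zeros_(conv.bias)
print(conv.weight.std())  # should be close to sqrt(1/27) ≈ 0.19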