How can I apply LeCun weight initialization to my Linear layer?

You can apply a torch.nn.init method (or any other custom weight initialization) to the modules directly, or e.g. via model.apply() and a weight_init function as described in this post.
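Here is a minimal sketch of the model.apply() approach. Note that torch.nn.init has no dedicated LeCun function, so the helper below (lecun_init is my own name) draws weights from N(0, 1/fan_in), which is LeCun normal initialization; zeroing the biases is an assumption, pick whatever bias init you prefer:

```python
import math
import torch
import torch.nn as nn

def lecun_init(m):
    # LeCun normal init: weight ~ N(0, 1/fan_in); here biases are zeroed.
    if isinstance(m, nn.Linear):
        fan_in = m.weight.size(1)          # number of input features
        std = math.sqrt(1.0 / fan_in)
        nn.init.normal_(m.weight, mean=0.0, std=std)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

model = nn.Sequential(
    nn.Linear(128, 64),
    nn.SELU(),
    nn.Linear(64, 10),
)
model.apply(lecun_init)  # recursively applies lecun_init to every submodule
```

Equivalently, since the gain for a linear/identity nonlinearity is 1, you could call nn.init.kaiming_normal_(m.weight, mode='fan_in', nonlinearity='linear') instead of the manual normal_ call.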
