How to initialize the weights/biases of RNN, LSTM, GRU?

Hi, I found a way as follows (but I'm not sure whether it's correct or not):

import torch.nn as nn
from torch.nn import init

a = nn.GRU(500, 50, num_layers=2)

for name, param in a.named_parameters():
    if 'weight' in name:
        # init.normal is deprecated; use the in-place init.normal_
        init.normal_(param, 0.0, 0.02)

This snippet initializes the weights of all layers.
Hope this helps :slight_smile:
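Since the question also asks about biases, the same loop can be extended with a bias branch. A minimal sketch, assuming a zero init for biases (a common but not universal choice):

```python
import torch.nn as nn
from torch.nn import init

gru = nn.GRU(500, 50, num_layers=2)

for name, param in gru.named_parameters():
    if 'weight' in name:
        init.normal_(param, 0.0, 0.02)  # N(0, 0.02^2) for weight matrices
    elif 'bias' in name:
        init.zeros_(param)              # zero-init all bias vectors
```

`named_parameters()` yields names like `weight_ih_l0`, `weight_hh_l0`, `bias_ih_l0`, `bias_hh_l0` for each layer, so the string checks cover every parameter of the module.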
