Hi, I found a way that works as follows (though I'm not sure whether it's fully correct):
import torch.nn as nn
from torch.nn import init

a = nn.GRU(500, 50, num_layers=2)

# Walk the per-layer parameter name lists and re-initialize
# every weight tensor in place with N(0, 0.02).
for layer_p in a._all_weights:
    for p in layer_p:
        if 'weight' in p:
            # init.normal is deprecated; use the in-place init.normal_
            init.normal_(getattr(a, p), 0.0, 0.02)
This snippet initializes the weights of all layers (note that `_all_weights` is a private attribute, so it may change between PyTorch versions).
Hope this helps!
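A variant of the same idea using the public `named_parameters()` API instead of the private `_all_weights` attribute — a sketch, assuming a reasonably recent PyTorch; the choice of zeroing the biases is my own addition, not part of the snippet above:

```python
import torch.nn as nn
from torch.nn import init

gru = nn.GRU(500, 50, num_layers=2)

# named_parameters() yields names like 'weight_ih_l0', 'bias_hh_l1', ...
for name, param in gru.named_parameters():
    if 'weight' in name:
        init.normal_(param, mean=0.0, std=0.02)  # in-place normal init
    elif 'bias' in name:
        init.zeros_(param)  # assumption: zero biases (common default)
```

Since `named_parameters()` covers every registered parameter, this also keeps working if you later change `num_layers` or set `bidirectional=True`.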