LSTM/GRU gate weights

Hi 🙂

I would like to apply a custom weight initialization to each gate of my RNN (GRU and LSTM). How can I get the weights of a specific gate in the GRU/LSTM implementation?

#LSTM
import torch.nn as nn

net = nn.LSTM(100, 100)  # assume only one layer
# weight_ih_l0 / weight_hh_l0 stack the four gates along dim 0
# in the order: input, forget, cell (candidate), output.
w_ii, w_if, w_ic, w_io = net.weight_ih_l0.chunk(4, 0)
w_hi, w_hf, w_hc, w_ho = net.weight_hh_l0.chunk(4, 0)

#GRU
net = nn.GRU(100, 100)
# Gate order here is: reset, update, new.
w_ir, w_iz, w_in = net.weight_ih_l0.chunk(3, 0)
w_hr, w_hz, w_hn = net.weight_hh_l0.chunk(3, 0)
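
Since chunk returns views into the underlying parameters, you can initialize a single gate in place. Here is a minimal sketch, assuming the nn.LSTM(100, 100) above; the particular init functions are just examples, not a prescribed choice:

import torch
import torch.nn as nn

net = nn.LSTM(100, 100)
with torch.no_grad():
    # chunk(4, 0) gives per-gate views of the stacked parameters, so the
    # in-place inits below modify net.weight_ih_l0 / net.weight_hh_l0 directly.
    w_ii, w_if, w_ic, w_io = net.weight_ih_l0.chunk(4, 0)
    w_hi, w_hf, w_hc, w_ho = net.weight_hh_l0.chunk(4, 0)
    nn.init.xavier_uniform_(w_if)  # input-to-hidden forget gate only
    nn.init.orthogonal_(w_hf)      # hidden-to-hidden forget gate only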

How do I get the biases?

You can get the weights of all layers with:

import torch.nn as nn

weight_list = []
rnn = nn.LSTM(100, 100, num_layers=20)
for name, param in rnn.named_parameters():
    if 'weight' in name:
        weight_list.append(param)
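
Building on named_parameters, here is a sketch of applying a per-gate initialization to every layer by parameter name; the orthogonal/xavier choices are just placeholders for whatever init you want:

import torch
import torch.nn as nn

rnn = nn.LSTM(100, 100, num_layers=2)
with torch.no_grad():
    for name, param in rnn.named_parameters():
        if name.startswith('weight_hh'):
            for gate in param.chunk(4, 0):  # one view per gate
                nn.init.orthogonal_(gate)
        elif name.startswith('weight_ih'):
            for gate in param.chunk(4, 0):
                nn.init.xavier_uniform_(gate)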

Great answer, thanks!
One detail to watch for the GRU: the second set of chunks must come from weight_hh_l0, not from weight_ih_l0 again, otherwise you just get the input-to-hidden weights twice.

@Dheeraj_M_Pai: You can get the biases via net.bias_ih_l0 and net.bias_hh_l0. The 0 in l0 is the layer index; if you have more than one LSTM layer, change it to 1, 2, or whichever layer you want to manipulate.
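
The bias vectors stack the gates the same way as the weights (shape 4 * hidden_size for an LSTM, in input/forget/cell/output order), so the same chunk trick gives per-gate bias views. A small sketch; setting the forget-gate bias to 1 is just one common example, not the only option:

import torch
import torch.nn as nn

net = nn.LSTM(100, 100)
with torch.no_grad():
    b_ii, b_if, b_ic, b_io = net.bias_ih_l0.chunk(4, 0)
    b_hi, b_hf, b_hc, b_ho = net.bias_hh_l0.chunk(4, 0)
    b_if.fill_(1.0)  # forget-gate bias (input-to-hidden part)
    b_hf.fill_(0.0)  # forget-gate bias (hidden-to-hidden part)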