Freezing only one layer of an LSTM

I have a pretrained model with this layer:

self.rnn1 = nn.LSTM(..., num_layers=3, ...)

Previously, I have frozen the whole model…
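For reference, freezing the whole model presumably looked something like this (a minimal sketch; model stands in for the actual pretrained nn.Module):

for param in model.parameters():
    param.requires_grad = False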

How can I freeze only the last layer of the LSTM in a clear way?

In other words, is it possible to split a pretrained 3-layer LSTM into three separate single-layer LSTMs (one layer each) in an easy way?

One way is to select the last layer's parameters by name: in an nn.LSTM, the parameters of layer k are named weight_ih_lk, weight_hh_lk, bias_ih_lk and bias_hh_lk, so the suffix "_l2" identifies the last of the three layers.

import torch.nn as nn

rnn1 = nn.LSTM(input_size=10, hidden_size=5, num_layers=3)

# Freeze every parameter that belongs to the last layer (names ending in "_l2").
for param_name, param in rnn1.named_parameters():
    if param_name.endswith("_l2"):
        print("Freezing {0}".format(param_name))
        param.requires_grad = False
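
As for the second question (splitting the pretrained 3-layer LSTM into three single-layer ones): the per-layer weights can be copied over by name. Here is a minimal sketch, assuming a unidirectional LSTM with biases; split_lstm is a hypothetical helper, and note it does not reproduce any inter-layer dropout of the stacked module:

import torch
import torch.nn as nn

def split_lstm(lstm):
    # Copy each layer k of a multi-layer LSTM into its own single-layer nn.LSTM.
    layers = nn.ModuleList()
    for k in range(lstm.num_layers):
        # Layer 0 sees the original input size; deeper layers see hidden_size.
        in_size = lstm.input_size if k == 0 else lstm.hidden_size
        single = nn.LSTM(input_size=in_size, hidden_size=lstm.hidden_size, num_layers=1)
        with torch.no_grad():
            for name in ("weight_ih", "weight_hh", "bias_ih", "bias_hh"):
                getattr(single, "{0}_l0".format(name)).copy_(
                    getattr(lstm, "{0}_l{1}".format(name, k)))
        layers.append(single)
    return layers

# Chaining the three single-layer LSTMs reproduces the stacked forward pass
# (input shape: seq_len x batch x input_size, since batch_first=False).
x = torch.randn(7, 1, 10)
out = x
for layer in split_lstm(rnn1):
    out, _ = layer(out)

With the LSTM split like this, freezing only the last layer reduces to freezing one of the three modules, e.g. setting requires_grad = False on every parameter of the third single-layer LSTM.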