How to modify activation in GRU

We know that in Keras we can choose the activation function of a GRU layer, for example:

```python
Bidirectional(GRU(128, activation='linear', return_sequences=True))(a1)  # (240, 256)
```

But in PyTorch there is no parameter to choose it:

```python
nn.GRU(n_in, n_hidden, bidirectional=True, dropout=dropout, batch_first=True, num_layers=num_layers)
```
I want to know how to modify the activation function of a GRU in PyTorch.
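
To show what I mean, here is a minimal sketch of the only workaround I can imagine: rewriting the GRU cell by hand so the candidate activation becomes a constructor argument. The class `CustomGRUCell` and its interface are my own invention, not part of torch, and I assume this would also lose the cuDNN-fused speed of `nn.GRU` — which is why I'm asking whether there is a better way.

```python
import torch
import torch.nn as nn

class CustomGRUCell(nn.Module):
    """Hand-written GRU cell where the candidate activation is configurable (hypothetical sketch)."""
    def __init__(self, input_size, hidden_size, activation=torch.tanh):
        super().__init__()
        self.hidden_size = hidden_size
        self.activation = activation          # e.g. torch.tanh, or lambda x: x for 'linear'
        # projections for reset gate r, update gate z, and candidate n
        self.x2h = nn.Linear(input_size, 3 * hidden_size)
        self.h2h = nn.Linear(hidden_size, 3 * hidden_size)

    def forward(self, x, h):
        # split the input and hidden projections into r, z, n parts
        x_r, x_z, x_n = self.x2h(x).chunk(3, dim=-1)
        h_r, h_z, h_n = self.h2h(h).chunk(3, dim=-1)
        r = torch.sigmoid(x_r + h_r)          # reset gate
        z = torch.sigmoid(x_z + h_z)          # update gate
        n = self.activation(x_n + r * h_n)    # candidate state: this is where the activation is swapped
        return (1 - z) * n + z * h            # new hidden state

# usage sketch: unroll the cell over time with a 'linear' activation
batch, seq_len, n_in, n_hidden = 4, 240, 32, 128
x = torch.randn(batch, seq_len, n_in)
cell = CustomGRUCell(n_in, n_hidden, activation=lambda t: t)
h = torch.zeros(batch, n_hidden)
for t in range(seq_len):
    h = cell(x[:, t, :], h)
```

Is something like this really necessary, or does torch provide a cleaner way to change the GRU activation?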