How to change activation of a GRU layer

Hi,
I am trying to use a GRU with two layers, using this module as an intermediate block in my model. Something like:

input -> Conv -> FC -> GRU -> FC -> Conv -> output
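For reference, here is a minimal sketch of the arrangement I have in mind (the channel, hidden, and kernel sizes are just placeholders, not my real configuration):

```python
import torch
import torch.nn as nn

class ConvGRUConv(nn.Module):
    def __init__(self, channels=16, hidden=32):
        super().__init__()
        self.conv_in = nn.Conv1d(1, channels, kernel_size=3, padding=1)
        self.fc_in = nn.Linear(channels, hidden)
        self.gru = nn.GRU(hidden, hidden, num_layers=2, batch_first=True)
        self.fc_out = nn.Linear(hidden, channels)
        self.conv_out = nn.Conv1d(channels, 1, kernel_size=3, padding=1)

    def forward(self, x):                  # x: (batch, 1, time)
        h = self.conv_in(x)                # (batch, channels, time)
        h = h.transpose(1, 2)              # (batch, time, channels)
        h = self.fc_in(h)                  # (batch, time, hidden)
        h, _ = self.gru(h)                 # tanh applied internally here
        h = self.fc_out(h)                 # (batch, time, channels)
        return self.conv_out(h.transpose(1, 2))

x = torch.randn(4, 1, 100)
print(ConvGRUConv()(x).shape)              # torch.Size([4, 1, 100])
```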

However, the tanh activation at the output of the GRU seems to hinder training. Any idea how I can disable the tanh activation and use a linear one instead?
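The only workaround I could think of is re-implementing the cell myself and dropping the tanh on the candidate state, roughly like the sketch below (weight layout follows the standard GRU equations; this is untested and would lose the fused nn.GRU kernel), but I'm not sure it's the right approach:

```python
import torch
import torch.nn as nn

class LinearGRUCell(nn.Module):
    """GRU cell with the tanh on the candidate state removed."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.x2h = nn.Linear(input_size, 3 * hidden_size)
        self.h2h = nn.Linear(hidden_size, 3 * hidden_size)

    def forward(self, x, h):
        x_r, x_z, x_n = self.x2h(x).chunk(3, dim=-1)
        h_r, h_z, h_n = self.h2h(h).chunk(3, dim=-1)
        r = torch.sigmoid(x_r + h_r)       # reset gate
        z = torch.sigmoid(x_z + h_z)       # update gate
        n = x_n + r * h_n                  # candidate state, tanh dropped
        return (1 - z) * n + z * h         # new hidden state
```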