Customize an activation function

nn.RNN supports only tanh or relu as the nonlinearity. One easy way to solve your problem is to write your own loop over the sequence, as in the example below.
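
Here is a minimal sketch of such a loop (the class and parameter names are just illustrative, and softsign is only an example of a custom activation, swap in whatever you need):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CustomActivationRNN(nn.Module):
    """Elman-style RNN with a user-supplied elementwise activation."""
    def __init__(self, input_size, hidden_size, activation=F.softsign):
        super().__init__()
        self.hidden_size = hidden_size
        self.activation = activation                      # any elementwise callable
        self.input2hidden = nn.Linear(input_size, hidden_size)
        self.hidden2hidden = nn.Linear(hidden_size, hidden_size)

    def forward(self, x, h=None):
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        if h is None:
            h = x.new_zeros(batch, self.hidden_size)
        outputs = []
        for t in range(seq_len):
            # same update as nn.RNN, but with the custom nonlinearity
            h = self.activation(self.input2hidden(x[t]) + self.hidden2hidden(h))
            outputs.append(h)
        return torch.stack(outputs), h

rnn = CustomActivationRNN(input_size=10, hidden_size=20)
x = torch.randn(5, 3, 10)                                 # (seq_len, batch, input_size)
out, h_n = rnn(x)
print(out.shape, h_n.shape)                               # torch.Size([5, 3, 20]) torch.Size([3, 20])
```

The loop will be slower than the fused nn.RNN kernels, but it gives you full control over the cell update.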
