Is there an easy way to remove the activation on the output in nn.RNN()? The docs say it uses tanh() by default and can be switched to relu(), but is there any way to have none at all?
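For context, nn.RNN only accepts 'tanh' or 'relu' for its `nonlinearity` argument, so there is no built-in identity option. A minimal sketch of what I mean by "no activation" — a hand-rolled linear recurrence (the `LinearRNN` name and layer layout here are my own, not part of torch):

```python
import torch
import torch.nn as nn

class LinearRNN(nn.Module):
    """A simple RNN whose hidden update applies no nonlinearity."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.i2h = nn.Linear(input_size, hidden_size)
        self.h2h = nn.Linear(hidden_size, hidden_size)

    def forward(self, x, h=None):
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        if h is None:
            h = x.new_zeros(batch, self.h2h.in_features)
        outputs = []
        for t in range(seq_len):
            # plain linear update: h_t = W_ih x_t + W_hh h_{t-1} + biases
            h = self.i2h(x[t]) + self.h2h(h)
            outputs.append(h)
        return torch.stack(outputs), h

x = torch.randn(5, 2, 4)
out, h = LinearRNN(4, 3)(x)
# out has shape (seq_len, batch, hidden_size); h is the final hidden state
```

This loops in Python, so it is slower than the fused nn.RNN kernel, but it makes the recurrence with no activation explicit.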
Michael