torch.tanh vs torch.nn.functional.tanh

A little confusing that both exist. Is either of them deprecated?


We should now deprecate torch.nn.functional.tanh on master, as Tensors and Variables are now merged.


If I use nn.Tanh, I have to declare it in my_model.__init__(), e.g.

self.tanh = nn.Tanh()

Whereas I can use nn.functional.tanh directly in my_model.forward(), e.g.

output = nn.functional.tanh(input)

If you deprecate nn.functional.tanh, I could do

output = nn.Tanh()(input)

wherever I need the functional form, but it would be slower because of the class instantiation.
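To make the comparison concrete, here is a minimal sketch of the three ways to apply tanh discussed above (the variable names are illustrative; F.tanh may emit a deprecation warning on some PyTorch versions but still works):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4)

# Module form: instantiated once (typically in __init__), then called in forward()
tanh = nn.Tanh()
out_module = tanh(x)

# Functional form: called directly, no module object to hold on to
out_functional = F.tanh(x)

# Plain tensor op: needs neither nn nor nn.functional
out_torch = torch.tanh(x)

# All three compute the same result
assert torch.equal(out_module, out_torch)
```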

What does this have to do with Variables and Tensors? I am confused.


Or you could still use torch.tanh(), or the equivalent tensor method:

output = input.tanh()

I expect the output to have both positive and negative values. So is

output = torch.tanh(model(input))

as the final output of the network fine?

Or are any other variants available?
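As a sketch of that setup (the layer sizes here are made up), a network whose forward pass ends in torch.tanh produces values strictly in (-1, 1), so it covers both signs:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Hypothetical model whose final activation is tanh."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 2)

    def forward(self, x):
        # tanh bounds the output to the open interval (-1, 1)
        return torch.tanh(self.fc(x))

net = Net()
out = net(torch.randn(3, 8))
```

If you need a different output range, e.g. (0, 1), torch.sigmoid is the usual alternative.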

There are only a few activations like torch.tanh in torch, but in torch.nn.functional I found a lot of functional activations.
Why is that?

Same confusion here :thinking: