Why is Sigmoid a class but not tanh?

Hi,

I am trying to understand why Sigmoid is a class in PyTorch but tanh is not.
Also, can we directly use torch.tanh as an activation function, or is there some better implementation of tanh as an activation function in PyTorch?

Hi,

There is no particular reason, AFAIK.
The class and function versions use the exact same implementation; the two exist just to make writing NNs nicer.
If you write a custom forward function for your Module, you can use the function version. If you use constructs like Sequential, you will have to use the class version.
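To illustrate the point above, here is a minimal sketch (the two-layer model is hypothetical, just for demonstration) showing the function version inside a custom forward and the class version as a "layer" in a Sequential:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 3)

    def forward(self, x):
        # function version: fine inside a custom forward
        return torch.tanh(self.fc(x))

# class version: required when the activation sits inside a Sequential
net = Net()
seq = nn.Sequential(nn.Linear(4, 3), nn.Tanh())

# copy the weights so the two models are identical
seq[0].load_state_dict(net.fc.state_dict())

x = torch.randn(2, 4)
print(torch.allclose(net(x), seq(x)))  # both compute the same thing
```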

Hello Granth -

I’m not quite sure what you’re asking, but just to be clear, pytorch does
have both a class and a function version of Tanh:

torch.nn.Tanh

torch.nn.functional.tanh

(Pytorch also has both a class and function version of Sigmoid.)

As an aside, note that pytorch's sigmoid() is the so-called logistic
function, and that it is essentially the same function as tanh(), up to
shifting and scaling: tanh(x) = 2 * sigmoid(2 * x) - 1.
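That identity is exact, and easy to check with plain Python (no PyTorch needed):

```python
import math

def sigmoid(x):
    # the logistic function, the same formula sigmoid() computes
    return 1.0 / (1.0 + math.exp(-x))

# tanh is a shifted, rescaled logistic: tanh(x) = 2*sigmoid(2x) - 1
for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    assert math.isclose(math.tanh(x), 2.0 * sigmoid(2.0 * x) - 1.0)
```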

Best.

K. Frank


Hi,

Are torch.tanh and nn.Tanh the same?
Which is better to use as an activation function?

Hello Granth -

I haven’t checked the code, but I am quite certain that what Alban
said is correct: Not only do:

torch.tanh(some_tensor)
torch.nn.functional.tanh(some_tensor)
torch.nn.Tanh()(some_tensor)

all perform the same computation (and return the same result), but
they all ultimately resolve to the same implementation to do so.

The first two versions in my above code are functions. I believe that
the second is being deprecated in favor of the first (for reasons that
I don’t understand).

The third version is a class whose instances are function objects.

(Note, there is no torch.nn.tanh. There is a torch.nn.Tanh (a
class) and a torch.nn.functional.tanh (a function) (and the
function torch.tanh).)

If you like using functions, use torch.tanh(some_tensor),
although you can instantiate the function-object version on the fly
and then call it: torch.nn.Tanh()(some_tensor). (The on-the-fly
function-object approach has the de minimis inefficiency of
instantiating a new object every time the Tanh()() call is made.)

If you want or need to have an instance of a function object, for
example in order to include it as a “layer” in a Sequential, then
you will need to use torch.nn.Tanh.

But again, in terms of the function that gets computed (rather than
the packaging), they are all the same.
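As a quick sanity check of that claim, the three spellings can be compared on the same tensor (note that torch.nn.functional.tanh may emit a deprecation warning on some versions):

```python
import torch
import torch.nn.functional as F

x = torch.randn(5)

a = torch.tanh(x)       # plain function
b = F.tanh(x)           # functional version (deprecated in favor of torch.tanh)
c = torch.nn.Tanh()(x)  # class instance called as a function object

assert torch.equal(a, b) and torch.equal(b, c)
```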

Best.

K. Frank
