Implementing sech

I’m trying to implement the sech function, but PyTorch doesn’t have a built-in implementation for it the way it does for tanh. I came up with two solutions:

import torch as tc
import math

# Solution 1: sech(z) = sqrt(1 - tanh(z)^2)
tanh_y = tc.tanh(input)
sech_y = tc.sqrt(1 - tanh_y.pow(2))
# if it's in an autograd.Function, I can then reuse tanh_y and sech_y for backward

# Solution 2: sech(z) = 2 / (e^z + e^-z)
a, b = tc.pow(math.e, input), tc.pow(math.e, -input)
# a = input.exp()
# b = a.reciprocal()
sech_y = 2 / (a + b)
# tanh_y = (a - b) / (a + b)

I’m just curious which one is better (perhaps faster), or whether it makes much of a difference.
Thanks for your time.
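Since the comment above mentions reusing `tanh_y` and `sech_y` in the backward pass, here is one way that idea could be sketched as a custom `torch.autograd.Function` (the class name `Sech` and the overall structure are my own choice, not from the original post); it uses the identity d/dz sech(z) = -sech(z)·tanh(z):

```python
import torch

class Sech(torch.autograd.Function):
    @staticmethod
    def forward(ctx, z):
        tanh_y = torch.tanh(z)
        # sech(z) = sqrt(1 - tanh(z)^2)
        sech_y = torch.sqrt(1 - tanh_y.pow(2))
        # save both intermediates so backward can reuse them
        ctx.save_for_backward(tanh_y, sech_y)
        return sech_y

    @staticmethod
    def backward(ctx, grad_output):
        tanh_y, sech_y = ctx.saved_tensors
        # d/dz sech(z) = -sech(z) * tanh(z)
        return grad_output * (-sech_y * tanh_y)

z = torch.linspace(-2.0, 2.0, 5, requires_grad=True)
out = Sech.apply(z)
```

The forward recomputes nothing in backward: both `tanh_y` and `sech_y` are stashed via `ctx.save_for_backward`, which is exactly the reuse the comment hints at.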


Given all the formulas I found online, out = 1/torch.cosh(z) should be the best (fewest total operations).

Thanks! I didn’t know there was a cosh function in PyTorch :frowning:
