I’m trying to implement the sech function, but PyTorch doesn’t have a built-in implementation for it the way it does for tanh. I came up with two solutions:
```python
import torch as tc
import math

# Solution 1: via the identity sech^2(x) = 1 - tanh^2(x)
tanh_y = tc.tanh(input)
sech_y = tc.sqrt(1 - tanh_y.pow(2))
# If it's in an autograd.Function, I can then reuse tanh_y and sech_y for backward.
```
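For the autograd.Function route mentioned above, here is a minimal sketch of what reusing `tanh_y` and `sech_y` in backward could look like. The class name `SechFn` is my own, and the backward relies on the identity d/dx sech(x) = -sech(x)·tanh(x):

```python
import torch

class SechFn(torch.autograd.Function):
    """Hypothetical custom sech that saves intermediates from forward for backward."""

    @staticmethod
    def forward(ctx, x):
        tanh_y = torch.tanh(x)
        # sech(x) = sqrt(1 - tanh(x)^2)
        sech_y = torch.sqrt(1 - tanh_y.pow(2))
        ctx.save_for_backward(tanh_y, sech_y)
        return sech_y

    @staticmethod
    def backward(ctx, grad_out):
        tanh_y, sech_y = ctx.saved_tensors
        # d/dx sech(x) = -sech(x) * tanh(x)
        return grad_out * (-sech_y * tanh_y)

x = torch.tensor([0.5, 1.0], requires_grad=True)
y = SechFn.apply(x)
y.sum().backward()
```

This avoids recomputing tanh and sech in the backward pass, which is the main appeal of solution 1.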
```python
# Solution 2: compute sech directly from the exponentials
a, b = tc.pow(math.e, input), tc.pow(math.e, -input)
# or:
# a = input.exp()
# b = a.reciprocal()
sech_y = 2 / (a + b)
# tanh_y = (a - b) / (a + b)
```
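Since both variants compute the same function, a quick way to sanity-check them side by side (assuming a PyTorch build that provides `torch.cosh`, which recent versions do) is:

```python
import torch

x = torch.linspace(-3.0, 3.0, 7)

# Solution 1: via the identity sech(x) = sqrt(1 - tanh(x)^2)
sech_1 = torch.sqrt(1 - torch.tanh(x).pow(2))

# Solution 2: via exponentials, sech(x) = 2 / (e^x + e^-x)
a = x.exp()
sech_2 = 2 / (a + a.reciprocal())

# Reference: sech(x) = 1 / cosh(x), using torch.cosh directly
sech_ref = 1 / torch.cosh(x)

print(torch.allclose(sech_1, sech_ref, atol=1e-5))
print(torch.allclose(sech_2, sech_ref, atol=1e-5))
```

If `torch.cosh` is available to you, `1 / torch.cosh(x)` may be the simplest option of all; otherwise, timing the two snippets above on your actual input sizes with `torch.cuda.synchronize()` (if on GPU) would settle the speed question.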
I’m just curious which one is better (perhaps faster), or whether it makes much of a difference at all.
Thanks for your time.