Implementation of complex-valued activation functions

I’m working with complex-valued activation functions and, similarly to the discussion at #47052, I am interested in knowing how ReLU (and other activation functions such as tanh, sigmoid, etc.) behaves when applied to complex-valued tensors. I looked into the PyTorch source code and did not find a concrete answer. Any ideas where I can find more information?
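For reference, here is a minimal sketch of the kind of behavior I have in mind, using two conventions from the complex-valued network literature: the "split" approach (apply the real activation to the real and imaginary parts independently, sometimes called CReLU) and modReLU (which acts on the magnitude and keeps the phase). This is only my own illustration, not a claim about how PyTorch implements anything internally, and the fixed `bias` in `mod_relu` is a simplification of what is normally a learnable per-feature parameter.

```python
import torch

def split_relu(z: torch.Tensor) -> torch.Tensor:
    # "Split" / CReLU-style convention: apply ReLU independently
    # to the real and imaginary parts, then recombine.
    return torch.complex(torch.relu(z.real), torch.relu(z.imag))

def mod_relu(z: torch.Tensor, bias: float = -0.5) -> torch.Tensor:
    # modReLU-style convention: shift the magnitude by a bias,
    # pass it through ReLU, and preserve the original phase.
    # (bias is a plain float here for simplicity; in the literature
    # it is usually a learnable parameter.)
    magnitude = torch.abs(z)
    scale = torch.relu(magnitude + bias) / (magnitude + 1e-8)
    return z * scale

z = torch.randn(4, dtype=torch.cfloat)
print(split_relu(z))
print(mod_relu(z))
```

My question is essentially which (if any) of these conventions the built-in activations follow for complex inputs, or whether they are simply not defined for complex dtypes.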