Complex-valued neural network

Hello,

ReLU is in fact based on comparing a real number with 0, and since there is no natural order on the complex numbers, taking the max of a complex number and zero is not well defined, and neither is ReLU for complex inputs.

Depending on the problem you are working on, you can define your own activation or use activations that are adapted to complex numbers (this includes all holomorphic functions). One common, easy-to-implement choice is to apply ReLU separately to the real and imaginary parts, as sketched below.
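
As an illustration, here is a minimal sketch of that "split" ReLU (sometimes called CReLU), assuming PyTorch with complex tensor support; the function name `complex_relu` is just for this example and not a library API:

```python
import torch

def complex_relu(z: torch.Tensor) -> torch.Tensor:
    # Apply ReLU independently to the real and imaginary parts.
    # Note: this is not holomorphic; it is simply one common choice.
    return torch.complex(torch.relu(z.real), torch.relu(z.imag))

# Example usage on a random complex tensor
z = torch.randn(4, dtype=torch.cfloat)
print(complex_relu(z))
```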