Alpha ReLU Invertibility?

Hi, I am reading a paper on reversible residual networks. It mentions that Leaky ReLU is an invertible function, so it is also possible to recover the BatchNorm layer values from the activations. I was thinking that Leaky ReLU and Alpha ReLU are pretty similar, and Alpha ReLU gives better results in some cases. So I just want to know: is Alpha ReLU an invertible function like Leaky ReLU?
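
For reference, here is a minimal NumPy sketch of why Leaky ReLU is invertible (the alpha value is just an illustrative choice): both branches are strictly monotonic whenever alpha > 0, so every output maps back to exactly one input.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Identity for x >= 0, slope alpha for x < 0
    return np.where(x >= 0, x, alpha * x)

def leaky_relu_inverse(y, alpha=0.01):
    # The inverse exists because each branch is strictly increasing
    # (this requires alpha > 0; alpha = 0 would collapse all negatives to 0)
    return np.where(y >= 0, y, y / alpha)

x = np.array([-2.0, -0.5, 0.0, 1.5])
y = leaky_relu(x)
assert np.allclose(leaky_relu_inverse(y), x)
```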