I am fairly new to PyTorch. I would like to implement LeakyReLU in PyTorch from scratch, writing it as a class with a forward method. Could someone help me with this? Also, do I need to write a backward method, or does PyTorch's automatic differentiation take care of that?
Hi,
In PyTorch you don't need a class: your leaky ReLU can be a plain Python function that takes an input tensor and returns the output after applying the leaky ReLU.
If you want to use it in nn.Sequential or similar containers, you can wrap that function in an nn.Module, defining __init__ (if you need to store parameters such as the negative slope) and a forward method that applies the leaky ReLU to the input.
You don't need a backward method: as long as the forward pass is built from differentiable PyTorch operations, autograd computes the gradients for you automatically.
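As a minimal sketch of both approaches (the function name, the nn.Module wrapper, and the `negative_slope` default are my own choices, mirroring `torch.nn.LeakyReLU`):

```python
import torch
import torch.nn as nn

def leaky_relu(x, negative_slope=0.01):
    # Elementwise: x where x > 0, otherwise negative_slope * x.
    # torch.where is differentiable, so autograd handles the backward pass.
    return torch.where(x > 0, x, negative_slope * x)

class LeakyReLU(nn.Module):
    """Wraps the function so it can be used in nn.Sequential."""
    def __init__(self, negative_slope=0.01):
        super().__init__()
        self.negative_slope = negative_slope

    def forward(self, x):
        return leaky_relu(x, self.negative_slope)
```

You can then use it like any built-in activation, e.g. `nn.Sequential(nn.Linear(10, 10), LeakyReLU(0.2))`.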