Create a self-defined activation function


(Bin Zhou) #1

I want to create an activation function, y = x^2, and call it in my network. How can I implement it?


(Arulkumar) #2

y = torch.pow(x, 2)?


(Bin Zhou) #4

How can I use it as an activation function?


(Arulkumar) #5

For ReLU activation, we do this:

out = some_convolution(input)
out = nn.functional.relu(out)  # relu activation

In the same way, you can implement your quadratic activation as:

out = some_convolution(input)
out = torch.pow(out, 2)  # quadratic activation
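
If you would rather plug the activation into an nn.Sequential model instead of calling it inside forward, a minimal sketch is to wrap it in a small nn.Module. The class name Square and the layer sizes below are just placeholders for illustration:

import torch
import torch.nn as nn

class Square(nn.Module):
    """Element-wise quadratic activation: y = x^2."""
    def forward(self, x):
        return torch.pow(x, 2)

# hypothetical usage inside a network
model = nn.Sequential(
    nn.Linear(10, 20),
    Square(),          # quadratic activation instead of ReLU
    nn.Linear(20, 1),
)

out = model(torch.randn(4, 10))  # behaves like any other activation layer

Because the module has no parameters, it works anywhere nn.ReLU() would, and autograd handles the gradient of x^2 automatically.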