I want to create an activation function, y = x^2, and call it in my network. How can I do that?
Is y = torch.pow(x, 2) the right way? How do I use it as an activation function?
For ReLU activation, we do this:
out = some_convolution(input)
out = nn.functional.relu(out) # relu activation
In the same way, you can apply your quadratic activation:
out = some_convolution(input)
out = torch.pow(out, 2) # quadratic activation
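If you want the activation to behave like a reusable layer (so it can sit inside nn.Sequential, for example), you can wrap it in a small nn.Module subclass. Here is a minimal sketch; the class name Quadratic and the layer sizes are just illustrative choices:

```python
import torch
import torch.nn as nn

class Quadratic(nn.Module):
    """Element-wise quadratic activation: f(x) = x^2."""
    def forward(self, x):
        # torch.pow is differentiable, so autograd handles the backward pass
        return torch.pow(x, 2)

# Example network using the custom activation (sizes are arbitrary)
net = nn.Sequential(
    nn.Linear(4, 8),
    Quadratic(),
    nn.Linear(8, 2),
)

x = torch.randn(3, 4)
out = net(x)
print(out.shape)  # torch.Size([3, 2])
```

Since torch.pow is built from differentiable operations, no custom backward function is needed; gradients flow through the activation automatically.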