However, when I try to call it, I am getting this error: TypeError: unsupported operand type(s) for +: 'ELU' and 'int'. How should I properly add 1 to the activation function? Thank you!
When you write nn.ELU() you're instantiating a module that represents the ELU function. The module instance can be called on a tensor, but it does not support arithmetic directly, so nn.ELU() + 1 tries to add the module object itself to an int (hence your error message). You need to call the module on an input first, then add 1 to the result.
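A minimal sketch of the failing pattern versus the working one (the variable names here are illustrative, not from your code):

```python
import torch
from torch import nn

elu = nn.ELU()
x = torch.randn(1)

# This is the mistake: adding the module object to an int.
# elu + 1  ->  TypeError: unsupported operand type(s) for +: 'ELU' and 'int'

# This works: apply the module to a tensor, then add 1 to the output.
y = elu(x) + 1
```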
The following code works:
import torch
from torch import nn

class SELU(nn.Module):
    def __init__(self):
        super().__init__()
        self.elu = nn.ELU()

    def forward(self, input):
        # Apply ELU to the input tensor, then shift the output by 1.
        return self.elu(input) + 1.

x = torch.randn(1)
selu = SELU()
elu = nn.ELU()
selu_y = selu(x)
elu_y = elu(x)
print(selu_y)  # e.g. tensor([2.0493]) — exact value depends on the random x
print(elu_y)   # e.g. tensor([1.0493]) — always exactly 1 less than selu_y
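If you don't need a module at all, a plain function over torch.nn.functional.elu gives the same result; a minimal sketch (note that the SELU name above is just a custom class and is unrelated to the built-in nn.SELU activation):

```python
import torch
import torch.nn.functional as F

def shifted_elu(x):
    # Functional equivalent of the SELU module above: ELU(x) + 1.
    return F.elu(x) + 1.

x = torch.randn(1)
print(shifted_elu(x))  # same value the module version prints for the same x
```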