Adding a constant to an activation function

Hello! I would like to define an activation function that is the normal ELU + 1. Based on what I found online, I decided to do this:

def SELU(input):
    return nn.ELU(input)+1+1e-15

class SELU(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, input):
        return SELU(input)

However, when I try to call it, I get this error: TypeError: unsupported operand type(s) for +: 'ELU' and 'int'. How do I properly add 1 to the activation function? Thank you!

Hi @smu226,

When you write nn.ELU(input), you aren't applying ELU to input; nn.ELU is a module class, so that call constructs an ELU module instance (with input interpreted as the alpha argument). You then try to add an int to that module object, hence your error message. To actually apply the activation, instantiate the module once and call the instance on your tensor, or use the functional form torch.nn.functional.elu.
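
For illustration, here is a quick check (using only standard PyTorch APIs) that the module instance and the functional form compute the same thing:

import torch
from torch import nn
import torch.nn.functional as F

x = torch.randn(3)

elu = nn.ELU()           # construct the module instance once...
y_module = elu(x)        # ...then call it on a tensor

y_functional = F.elu(x)  # or apply the functional form directly

print(torch.allclose(y_module, y_functional))  # True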

With that in mind, the following code works:

import torch
from torch import nn

class SELU(nn.Module):

    def __init__(self):
        super().__init__()
        self.elu = nn.ELU()  # instantiate the module once, reuse it in forward

    def forward(self, input):
        return self.elu(input) + 1.  # apply ELU, then shift the result up by 1


x = torch.randn(1)
selu = SELU()
elu = torch.nn.ELU()

selu_y = selu(x)
elu_y = elu(x)

print(selu_y)  # e.g. tensor([2.0493])
print(elu_y)   # e.g. tensor([1.0493]); always exactly 1 less than selu_y
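
If you also want the tiny 1e-15 offset from your original snippet, a minimal variant (same class, just folding the constant into forward) would be:

import torch
from torch import nn

class SELU(nn.Module):
    def __init__(self):
        super().__init__()
        self.elu = nn.ELU()

    def forward(self, input):
        # ELU saturates at -1 for very negative inputs, so elu(input) + 1.
        # can round to exactly 0 in floating point; the 1e-15 keeps the
        # output strictly positive, matching your original function
        return self.elu(input) + 1. + 1e-15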
