Multiplication of activation function with learnable parameter/scalar

Hello guys,

I am building a custom neural network and want to form a linear combination of the outputs of two different layers after applying a non-linearity to each. For example, if x and y are the outputs of two layers and f is an activation function (relu, softplus, etc.), I wish to compute:
a * f(x) + (1 - a) * f(y)
I have tried several things, such as expanding the scalar to match the dimensions of the tensors x and y, but I keep getting this error:
"unsupported operand type(s) for *: 'Tensor' and 'Softplus'"

Any ideas on how to make it work?

Thank you in advance

I think you are missing the instantiation of the nn.Softplus module, so the tensor is being combined with the module class itself rather than its output. You might be running into this error:

torch.randn(1) + nn.Softplus(torch.tensor(1.))
> TypeError: unsupported operand type(s) for +: 'Tensor' and 'Softplus'

You would have to create the module instance first and then apply it to the tensor, or use the functional API:

torch.randn(1) + nn.Softplus()(torch.tensor(1.))
torch.randn(1) + F.softplus(torch.tensor(1.))  # F is torch.nn.functional
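To tie this back to the original question, here is a minimal sketch (module and attribute names are my own) of how the mixing coefficient a can be registered as an nn.Parameter so it is learned during training, while the functional softplus avoids the module-instantiation pitfall above:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedActivation(nn.Module):
    """Computes a * f(x) + (1 - a) * f(y) with a learnable scalar a."""

    def __init__(self):
        super().__init__()
        # Learnable mixing coefficient; a 0-dim tensor broadcasts
        # against x and y, so no explicit expand() is needed.
        self.a = nn.Parameter(torch.tensor(0.5))

    def forward(self, x, y):
        # F.softplus is the functional form: no module object to instantiate.
        return self.a * F.softplus(x) + (1 - self.a) * F.softplus(y)

mix = MixedActivation()
x, y = torch.randn(4, 8), torch.randn(4, 8)
out = mix(x, y)
print(out.shape)  # torch.Size([4, 8])
```

Because self.a is a parameter of the module, it shows up in mix.parameters() and receives gradients like any other weight, so an optimizer will update it alongside the rest of the network.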

That worked. Thank you very much for the answer!