Hello guys,
I am creating a custom neural network and want to compute a linear combination of the outputs of two different layers, after applying a non-linearity to each. For example, if x and y are the outputs of two layers and f is an activation function (ReLU, softplus, etc.), I wish to compute:
a*f(x)+(1-a)*f(y)
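For concreteness, here is a minimal sketch of the computation I am after. The shapes and the value of `a` are just placeholders; I use the functional form `torch.nn.functional.softplus` here, since a plain Python float broadcasts over any tensor shape without needing `expand`:

```python
import torch
import torch.nn.functional as F

a = 0.3  # mixing coefficient; a scalar broadcasts over any tensor shape
x = torch.randn(4, 8)  # placeholder output of the first layer
y = torch.randn(4, 8)  # placeholder output of the second layer

# apply the non-linearity first, then take the convex combination
out = a * F.softplus(x) + (1 - a) * F.softplus(y)
print(out.shape)  # torch.Size([4, 8])
```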
I have tried several things, such as using the expand operator to match the scalar to the dimensions of tensors x and y, but I keep getting the error:
"unsupported operand type(s) for *: 'Tensor' and 'Softplus'"
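For what it's worth, the error message mentions the type `Softplus` (the `nn.Module`), not a tensor, which makes me suspect the activation module itself is being multiplied somewhere rather than its output. This hypothetical snippet reproduces the same error:

```python
import torch
import torch.nn as nn

a = torch.tensor(0.3)
f = nn.Softplus()  # this is a Module, not a Tensor
x = torch.randn(4, 8)

# Forgetting to call the module raises:
#   TypeError: unsupported operand type(s) for *: 'Tensor' and 'Softplus'
try:
    bad = a * f  # should have been a * f(x)
except TypeError as e:
    print(e)
```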
Any ideas on how to make it work?
Thank you in advance