I want to use a custom activation function that has a random component that gets applied to every neuron individually.
If I use the standard method and call the activation function on a layer, it applies the same value to every neuron in that layer.
I am looking for the most efficient way to have the activation function affect every neuron individually and would appreciate any advice on the topic.
The only method I currently know of involves splitting the input tensor, applying the function to each part, and then concatenating the results, but I am hoping there is a more efficient way.
If you want a modified mish, then I would implement the function using just regular element-wise operations. You can then create a Tensor with the coefficients you want and add/multiply it at any point you need in the computation.
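For instance, a minimal sketch of this idea, assuming a mish-style variant `x * tanh(softplus(beta * x))` where `beta` is a random coefficient sampled independently for every element (the variant and the sampling range are illustrative, not from the thread):

```python
import torch
import torch.nn.functional as F

def random_mish(x, low=0.9, high=1.1):
    # Sample an independent beta for every element of x.
    # empty_like gives a tensor of the same shape; uniform_ fills it in place.
    beta = torch.empty_like(x).uniform_(low, high)
    # Element-wise mish-style activation with the per-element coefficient.
    return x * torch.tanh(F.softplus(beta * x))

x = torch.randn(4, 8)
out = random_mish(x)  # same shape as x
```

Because everything here is element-wise, the per-element randomness comes for free from the shape of `beta`; no splitting or concatenation is needed.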
Where does it happen?
Note that in the code above I assume x is 1D. You might need to adapt that.
For example, if x is 2D:
If you want beta to be the same for all elements of a given sample, you can simply add `beta = beta.unsqueeze(-1)` to add a new dimension of size 1 to beta, and the broadcasting logic will take care of expanding it.
If you want a different beta for every element of every sample, then just update the size given to torch.empty() to reflect the full size of x.
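Assuming x has shape `(batch, features)`, the two options above can be sketched as (shapes and the sampling range are illustrative):

```python
import torch

x = torch.randn(32, 500)  # (batch, features)

# Option 1: one beta per sample, broadcast across the feature dimension.
beta = torch.empty(32).uniform_(0.9, 1.1)
beta = beta.unsqueeze(-1)   # shape (32, 1)
y = beta * x                # broadcasts to (32, 500)

# Option 2: a different beta for every element of every sample.
beta_full = torch.empty_like(x).uniform_(0.9, 1.1)  # shape (32, 500)
z = beta_full * x           # element-wise, shape (32, 500)
```

In both cases the multiply produces a tensor the same shape as `x`; the only difference is how many independent random values are drawn.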
Hi, I'm also trying to apply a different activation function per neuron in a simple MLP. Could you please share how you managed to split the tensors from a linear layer to apply different activation functions? I want my model's first layer to have seven nodes and to apply a different activation function per node. I tried `index_select` and the other technique mentioned, but I can't get it to work. I'm getting errors like this:

```
The size of tensor a (64) must match the size of tensor b (500) at non-singleton dimension 1
```

Maybe the problem stems from the fact that I don't know what a tensor representing the layer looks like, and I'm simply indexing it the way one would index a list. Please also point me to where I can find more info about this. Thanks, and sorry for the long paragraph.
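For what it's worth, one way to sketch the split-and-concatenate approach for a seven-node layer: index the columns of the layer's *output* (shape `(batch, 7)`, batch dimension first), apply one activation per column, and reassemble. The specific activations and sizes below are placeholders, not from the thread:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

layer = nn.Linear(10, 7)  # first layer with seven output nodes

# One activation per output node (arbitrary illustrative choices).
activations = [torch.relu, torch.tanh, torch.sigmoid,
               F.softplus, torch.sin, F.gelu, torch.abs]

x = torch.randn(64, 10)   # batch of 64 samples
h = layer(x)              # shape (64, 7): dim 0 is the batch, dim 1 the nodes

# Apply the i-th activation to the i-th column, then stack back to (64, 7).
out = torch.stack([f(h[:, i]) for i, f in enumerate(activations)], dim=1)
```

Indexing `h[:, i]` selects node i across the whole batch; indexing `h[i]` instead would select sample i, which is the kind of mix-up that produces shape-mismatch errors like the one quoted above.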