How to apply different activation functions to different neurons

For a simple MLP neural network, I want to assign different activation functions to different neurons, but I don't know how to do that in PyTorch. Does anyone know how? Thanks for helping me!


MLP = Linear layer followed by activation, right?
Use slicing to select parts of the Linear output, apply activations, then recombine using torch.cat

linear_out = self.linear(x)  # shape is (batch, features)
first_slice = linear_out[:, 0:split_point]
second_slice = linear_out[:, split_point:]
tuple_of_activated_parts = (
    F.relu(first_slice),
    torch.tanh(second_slice),  # F.tanh is deprecated; use torch.tanh instead
)
out = torch.cat(tuple_of_activated_parts, dim=1)  # concatenate over the feature dimension
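For context, here is one way the snippet above might be wrapped into a complete module. This is a minimal sketch: the class name SplitActivationMLP and its constructor arguments are illustrative, not from the original post.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitActivationMLP(nn.Module):
    # Hypothetical module: ReLU on the first split_point features, tanh on the rest.
    def __init__(self, in_features, out_features, split_point):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.split_point = split_point

    def forward(self, x):
        linear_out = self.linear(x)  # shape is (batch, out_features)
        first = F.relu(linear_out[:, :self.split_point])
        second = torch.tanh(linear_out[:, self.split_point:])
        return torch.cat((first, second), dim=1)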

But this will not work if the slices are interleaved, e.g. applying ReLU to even neurons and sigmoid to odd neurons. I think torch.index_select or torch.gather could be used to extract those indices, and torch.cat to recombine; see the sketch below.
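One way to handle the interleaved case (a sketch, not the only option) is to build integer index tensors and assign the activated values back into an output tensor, rather than using torch.cat, which would reorder the features. The helper name interleaved_activations is hypothetical.

import torch
import torch.nn.functional as F

def interleaved_activations(linear_out):
    # Hypothetical helper: ReLU on even-indexed features, sigmoid on odd ones.
    n = linear_out.size(1)
    even_idx = torch.arange(0, n, 2, device=linear_out.device)
    odd_idx = torch.arange(1, n, 2, device=linear_out.device)
    out = torch.empty_like(linear_out)
    out[:, even_idx] = F.relu(linear_out[:, even_idx])
    out[:, odd_idx] = torch.sigmoid(linear_out[:, odd_idx])
    return out

Writing into out by index keeps each neuron at its original position, which is why this works where extract-and-concatenate would not.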

Thanks for the reply. I need to apply different activation functions at arbitrary (unordered) neuron positions, not just contiguous slices.

Thanks for the reply, I will try your advice. But I am a new PyTorch user, so if you could show me the code, it would be more convenient. Anyhow, thanks.