Forcing output of torch.nn.Linear layer between 0 and 1

How can I force the output of a torch.nn.Linear layer to be between 0 and 1 without any normalization function (e.g. sigmoid)?

I have 3 output nodes on the linear layer. They can take negative values or values that differ widely, but if I apply any normalizing function, specifically sigmoid, it forces the output values for all three nodes into the range 0.30-0.35 (from what I observed).

Any suggestions would be highly appreciated.

If your network maps those values to 0.30-0.35, it means it hasn't learned the task. The network should slowly converge toward binary values if it is designed to do so; make sure training actually reaches the optimal value.
You can simply clamp the values, so that values already between 0 and 1 remain unchanged after the transform. Alternatively, you can rescale the whole output interval into [0, 1].
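A minimal sketch of both options, assuming a hypothetical layer size and random input just for illustration:

```python
import torch

# Hypothetical linear layer with 3 output nodes, as in the question.
linear = torch.nn.Linear(in_features=16, out_features=3)
x = torch.randn(4, 16)   # batch of 4 made-up inputs
raw = linear(x)          # unbounded outputs, may be negative

# Option 1: clamp. Values already in [0, 1] pass through unchanged;
# anything outside is cut off at the boundary.
clamped = torch.clamp(raw, min=0.0, max=1.0)

# Option 2: min-max rescale. The whole output range of each sample is
# mapped linearly onto [0, 1], preserving relative ordering.
mn = raw.min(dim=1, keepdim=True).values
mx = raw.max(dim=1, keepdim=True).values
rescaled = (raw - mn) / (mx - mn + 1e-8)  # small eps avoids division by zero

print(clamped)
print(rescaled)
```

Note the trade-off: clamping has zero gradient outside [0, 1], so saturated outputs stop receiving updates, while min-max rescaling keeps gradients flowing everywhere but makes each output depend on the batch sample's own min and max.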