How to initialize a neural network so the output starts at the value I want?

Maybe a little stupid… Suppose I have a neural network whose last layer is a sigmoid. PyTorch will give 0.5 as the initial output value. So how can I initialize the output to 0.1? Is it correct to just subtract 0.4 in the forward pass? Will that have a negative effect on the subsequent training? Thanks!

The initial output depends on the input and the parameter statistics.
Usually you won’t get a perfect 0.5, but the mean of your outputs might be approximately 0.5.
You could try to change the initialization of (some) layers to push the initial output downwards.
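For illustration, here is a minimal sketch (the toy two-layer MLP and the input sizes are just assumptions) showing why the mean output hovers around 0.5 with the default initialization: the pre-sigmoid activations are roughly zero-centered, so the sigmoid outputs average near 0.5 even though individual samples vary.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy model ending in a sigmoid, default PyTorch init.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
    nn.Sigmoid(),
)

x = torch.randn(1024, 16)  # random inputs just for inspection
with torch.no_grad():
    out = model(x)

# Individual outputs are not 0.5, but their mean is typically close to it.
print(out.mean().item(), out.std().item())
```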

Thanks for your reply! But it’s still hard to get the exact value (0.1) I want, right? And if I just want the output to have a mean of 0.1, is it correct to just subtract 0.4 in the forward pass? Because that’s much easier.

Subtracting a constant from your input won’t help much, as e.g. a ReLU might kill it after the first layer.
Maybe you could change the bias of the last linear layer to create a specific offset, but I doubt you will be able to get exactly 0.1 without zeroing the weights.
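A minimal sketch of that bias-offset idea (the layer sizes and names are assumptions): set the last linear layer’s bias to the logit of the target probability and shrink its weights so the bias dominates at initialization. With the weights zeroed, the initial output is exactly sigmoid(bias) = 0.1; with small but non-zero weights it will only be approximately 0.1.

```python
import math
import torch
import torch.nn as nn

target = 0.1
logit = math.log(target / (1.0 - target))  # inverse sigmoid, ~ -2.197

# Hypothetical toy model ending in a sigmoid.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
    nn.Sigmoid(),
)

last_linear = model[2]
with torch.no_grad():
    last_linear.bias.fill_(logit)
    last_linear.weight.mul_(0.0)    # zero the weights for an exact 0.1
    # last_linear.weight.mul_(0.01) # ...or just shrink them to stay near 0.1

x = torch.randn(8, 16)
print(model(x).squeeze())           # every initial output is 0.1
```

Note that even with the last layer’s weights zeroed, their gradients are generally non-zero (they depend on the previous layer’s activations), so training can still move the output away from 0.1.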

May I ask what your use case is that you need a specific initial value?

I’m doing reinforcement learning. The output of the agent is a probability, and the reward from the environment only changes noticeably when the output is below 0.2.