How do I add appropriate noise to a neural network with constant weights so that backpropagation training works?

I have a neural network in a synthetic experiment where the scale of the weights matters and I do not wish to remove it, and where the network is initialized with a prior that is non-zero and equal everywhere (i.e. every weight starts at the same constant value).

How do I add noise appropriately so that it trains well with the gradient descent rule?

I was thinking of adding the noise from Xavier initialization to my constant-weight NN: create a new NN with Xavier init (or the standard PyTorch init) and then add the constant value I need to every weight.
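
Concretely, something like the sketch below is what I had in mind (the prior value, layer sizes, and the choice of Xavier uniform are just placeholders for my actual setup):

```python
import torch
import torch.nn as nn

# Placeholder constant prior value for the experiment.
PRIOR = 0.1

# Small example network; PyTorch's default Linear init already gives
# scaled random values, as would xavier_uniform_.
net = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)

with torch.no_grad():
    for module in net.modules():
        if isinstance(module, nn.Linear):
            # Re-initialize with Xavier noise (or keep the default init).
            nn.init.xavier_uniform_(module.weight)
            # Shift every weight by the constant prior, so each weight is
            # "constant value + small random noise".
            module.weight.add_(PRIOR)
            # Set biases to the prior directly (or leave them at zero).
            nn.init.constant_(module.bias, PRIOR)
```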

Would that work?


cross-posted: