TF equivalent of truncated_normal_initializer

What is the PyTorch equivalent of TensorFlow's tf.truncated_normal_initializer?
I guess you can use normal initialization and then truncate it with torch.clamp().

This would not be equivalent to tf.initializers.truncated_normal, and is probably not what you want.

  • In TF, samples outside [mean - 2*std, mean + 2*std] are discarded and re-drawn
  • With torch.clamp(), samples outside [min, max] are clamped to min or max
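To mimic the discard-and-redraw behaviour in PyTorch, you can resample out-of-range entries in a loop until all values fall inside [mean - 2*std, mean + 2*std]. A minimal sketch (the function name truncated_normal_ is my own, not a PyTorch API):

```python
import torch

def truncated_normal_(tensor, mean=0.0, std=1.0):
    # Fill tensor from N(mean, std), then re-draw any samples falling
    # outside [mean - 2*std, mean + 2*std], like TF's truncated_normal.
    lower, upper = mean - 2 * std, mean + 2 * std
    tensor.normal_(mean, std)
    while True:
        invalid = (tensor < lower) | (tensor > upper)
        if not invalid.any():
            return tensor
        # Redraw only the out-of-range entries.
        tensor[invalid] = torch.empty(
            int(invalid.sum()), dtype=tensor.dtype
        ).normal_(mean, std)

w = truncated_normal_(torch.empty(3, 4), mean=0.0, std=0.02)
# All weights are guaranteed to lie within [-0.04, 0.04].
```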

If you think about the probability density function, using torch.clamp() would create two spikes, one at min and one at max.
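You can see those spikes empirically: after clamping a standard normal at ±2, roughly 4.5% of the samples (2 * P(Z > 2) ≈ 0.0455) sit exactly on the bounds, instead of being redistributed inside the interval as TF's rejection sampling would do. A quick check:

```python
import torch

torch.manual_seed(0)
x = torch.randn(100_000).clamp(-2, 2)
# Fraction of samples that pile up exactly at the clamp bounds.
frac_at_bounds = ((x == -2) | (x == 2)).float().mean().item()
# Expect roughly 0.045, i.e. two point masses in the density.
```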

So to me, the initial question remains open.