Help with understanding whether images are getting normalized correctly and its effect on network performance

Hello, I am working with the BreakHis dataset (breast histopathology images). On computing the per-channel mean and standard deviation, I get mean = (0.7941, 0.6329, 0.7694) and std = (0.1024, 0.1363, 0.0908). However, when I use these values for normalization, the normalized image's min and max values come out around -8 and +3, respectively. Are these values correct? How does incorrect image normalization affect network performance? Can it cause vanishing gradient issues?
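For reference, here is a small sketch of how I checked the expected range: assuming pixel values in [0, 1] and per-channel normalization `(x - mean) / std`, the extremes can be computed directly from the statistics above (the variable names are just for illustration).

```python
import numpy as np

mean = np.array([0.7941, 0.6329, 0.7694])
std = np.array([0.1024, 0.1363, 0.0908])

# Per-channel extremes after (x - mean) / std, for x in [0, 1]
lo = (0.0 - mean) / std  # value a pure-black pixel maps to
hi = (1.0 - mean) / std  # value a pure-white pixel maps to

print("per-channel min:", lo)  # roughly [-7.76, -4.64, -8.47]
print("per-channel max:", hi)  # roughly [ 2.01,  2.69,  2.54]
```

So with these statistics, values near -8 and +3 seem to follow directly from the small std and the mean being close to 1.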

The following is the histogram for one of the original images; as can be seen, it is skewed towards the brighter end.