What is the standard scale of BatchNorm1d?

You are somewhat close, but let me explain what I am particularly unclear about.
Batch normalization applies to a layer that we can represent as a tensor of activations A. The values of A lie somewhere in the interval [r1, r2], meaning all of the activations fall within that range.
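
To pin down notation, by batch norm I mean the standard transform, as documented for torch.nn.BatchNorm1d, with learnable parameters gamma and beta:

$$ B = \gamma \cdot \frac{A - \mathrm{E}[A]}{\sqrt{\mathrm{Var}[A] + \epsilon}} + \beta $$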

After batch norm, which is just a transformation of the activations A, we get an activations tensor B. What would be the range of B?
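
Here is a minimal sketch of what I mean; the input tensor, its range [0, 10], and the layer size are made up just for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical example: activations A for a layer with 4 features,
# with values in some interval [r1, r2] -- here roughly [0, 10].
A = torch.rand(32, 4) * 10.0

# BatchNorm1d with its default settings (affine=True, eps=1e-5);
# gamma starts at 1 and beta at 0.
bn = nn.BatchNorm1d(num_features=4)

B = bn(A)  # the transformed activations

# Compare the value ranges of A and B empirically.
print("A range:", A.min().item(), A.max().item())
print("B range:", B.min().item(), B.max().item())
```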