How do I normalize a float value to the [0, 1] range?

I am working on a discriminator whose goal is to decide whether the video it receives as input is human-made or not.
The output is a value between 0 (false) and 1 (true) for each frame; the final result is then the sum of the per-frame evaluations.
How should I normalize that sum so it also falls between 0 and 1?
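
To make the setup concrete, here is a minimal sketch of what I mean; `evaluate_frame` and `evaluate_video` are just placeholder names, not my real code, and the per-frame score stands in for the discriminator's output:

```python
import random

def evaluate_frame(frame):
    # Placeholder for the discriminator's per-frame output, assumed to be in [0, 1]
    # (e.g. the output of a sigmoid layer). Here it is just a random stand-in.
    return random.random()

def evaluate_video(frames):
    frame_scores = [evaluate_frame(f) for f in frames]  # each score in [0, 1]
    total = sum(frame_scores)                            # range is [0, len(frames)]
    # This is where I am unsure: should 'total' be divided by len(frames)
    # (i.e. take the mean) to bring it back to [0, 1], or handled differently?
    return total

if __name__ == "__main__":
    dummy_frames = list(range(30))  # stand-in for 30 video frames
    print(evaluate_video(dummy_frames))
```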

I also have another doubt: should I normalize each frame's result and then normalize the sum as well, or leave the frame results as they are and normalize only the sum?