About the softmax function

I learned some feature maps from a network and fed them to softmax to normalize the values to [0, 1], but I found that all the resulting values are small: the maximum is only about 0.1. I would like to apply min-max normalization to the resulting feature maps, that is, rescale each map so that its maximum becomes 1 and its minimum becomes 0. However, I am not sure whether such min-max normalization would break gradient backpropagation, because it computes the maximum and minimum values, which are not differentiable. Could anybody give me some suggestions?
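For reference, this is roughly what I mean; a minimal sketch in PyTorch, where the tensor shapes and names are just illustrative:

```python
import torch

# Hypothetical feature maps; in practice these come from the network.
feats = torch.randn(4, 8, requires_grad=True)

# Softmax squashes values into (0, 1), but each row sums to 1,
# so the per-element values end up small.
probs = torch.softmax(feats, dim=-1)

# Min-max normalization per feature map: maximum -> 1, minimum -> 0.
mn = probs.min(dim=-1, keepdim=True).values
mx = probs.max(dim=-1, keepdim=True).values
normed = (probs - mn) / (mx - mn + 1e-8)  # eps guards against mx == mn

# min/max are subdifferentiable: autograd routes the gradient through
# the elements that attained the extrema, so backward still runs.
normed.sum().backward()
print(feats.grad is not None)  # True
```

This runs without error for me, so autograd does produce a gradient through min/max; my question is whether training this way is sound.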