How do I cap a PyTorch tensor so that it doesn't exceed a value through backpropagation?

As the title says. For example, if my tensor represents a probability, how do I, in a simple way, ensure it always stays in the range [0, 1] through backpropagation? (Not by implementing situation-specific tricks like a logit transform.)
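To illustrate what I mean, here is a minimal hypothetical sketch (the names `p`, `opt`, and the toy loss are just for illustration): a learnable scalar meant to be a probability can easily be pushed out of [0, 1] by a gradient step, and the only generic fix I know is clamping it in place after each step.

```python
import torch

# Hypothetical minimal setup: a single learnable scalar meant to
# represent a probability, optimized with plain SGD.
p = torch.tensor(0.9, requires_grad=True)
opt = torch.optim.SGD([p], lr=1.0)

# A toy loss that pushes p upward; one large step drives p past 1.0.
loss = -p
loss.backward()
opt.step()
print(p.item())  # 1.9 -- outside [0, 1]

# The workaround I know: clamp the parameter in place after every
# optimizer step (projected gradient style).
with torch.no_grad():
    p.clamp_(0.0, 1.0)
print(p.item())  # 1.0 -- back inside the valid range
```

Is there a cleaner, general way to enforce this constraint than manually clamping after every step?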