Does the `.clamp` function have a gradient-clipping effect?

Which of these is better for my use case? I have to use softmax because I need numbers between 0 and 1 for an attention mechanism.

```python
F.log_softmax(ambiguity, dim=1).exp().clamp(min=0.01, max=0.99)
# or
F.softmax(ambiguity.clamp(min=-8, max=8), dim=1)
```
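To illustrate what I mean by "gradient clip effect", here is a minimal sketch with toy values (not my real `ambiguity` tensor): PyTorch's `.clamp` passes the gradient through unchanged for in-range entries but zeroes it for entries that were clamped.

```python
import torch

# Toy logits; requires_grad so we can inspect the backward pass.
x = torch.tensor([-10.0, 0.0, 10.0], requires_grad=True)

# Clamp into [-8, 8]: the first and last entries are saturated.
y = x.clamp(min=-8, max=8)

# Backprop a simple sum so each element's local gradient is visible.
y.sum().backward()

print(y)       # clamped values: [-8., 0., 8.]
print(x.grad)  # gradient is 1 inside the range, 0 where clamping kicked in
```

So clamping the logits before softmax kills the gradient entirely for saturated entries, whereas clamping the softmax outputs only blocks the gradient for the clamped probabilities themselves.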