torch.clamp() in a JIT-traced model enables requires_grad

When I run a forward pass with a model traced via torch.jit.trace, torch.clamp() appears to set requires_grad to True on its output. In eager mode this does not happen; it only occurs with the traced model. To confirm that torch.clamp() is responsible, I printed x.requires_grad immediately before and after the clamp operation.
This behavior seems like a bug. Is there a fix or workaround?
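Here is a minimal sketch of the kind of check described above (the module, shapes, and clamp bounds are placeholders, not the actual model). It compares requires_grad on the clamp output in eager mode, through the traced model, and with a torch.no_grad() workaround:

```python
import torch

class ClampModel(torch.nn.Module):
    def forward(self, x):
        return torch.clamp(x, min=0.0, max=1.0)

model = ClampModel().eval()
example = torch.randn(1, 3)

# Eager mode: clamp should preserve requires_grad of the input (False here)
eager_out = model(example)
print("eager requires_grad:", eager_out.requires_grad)

# Traced mode: check whether requires_grad unexpectedly becomes True
traced = torch.jit.trace(model, example)
traced_out = traced(torch.randn(1, 3))
print("traced requires_grad:", traced_out.requires_grad)

# Workaround: wrapping the traced call in torch.no_grad()
# (or detaching the output) keeps requires_grad False
with torch.no_grad():
    out = traced(torch.randn(1, 3))
print("no_grad requires_grad:", out.requires_grad)
```

If the traced output really does come back with requires_grad=True, the no_grad wrapper (or calling .detach() on the output) avoids building the autograd graph during inference.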

Could you post a minimal code snippet? I cannot reproduce this issue.