What's the difference between detach, detach_, and requires_grad

Hi,
I don't really understand why we need detach() and detach_(). Sometimes I'm told I just have to use them, for example the target of smooth_l1_loss must be detached. What's the difference between requires_grad, detach(), and detach_()? Can you give me some info?
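Here is a minimal sketch of the kind of pattern I mean (the linear layer and random inputs are just placeholders, not my real code):

```python
import torch
import torch.nn.functional as F

# Placeholder network just to produce two tensors that live in the graph.
net = torch.nn.Linear(4, 1)

pred = net(torch.randn(8, 4))      # prediction, tracked by autograd
target = net(torch.randn(8, 4))    # target, also tracked by autograd

# Without .detach() the loss would also backpropagate into the target branch,
# which is (apparently) why the target has to be detached.
loss = F.smooth_l1_loss(pred, target.detach())
loss.backward()
```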

Any help, please? I really don't understand…

detach_() is the in-place version of detach(): detach() returns a new tensor that shares the same data but is cut off from the computation graph, while detach_() detaches the tensor itself. requires_grad is just the flag that tells autograd whether to record operations on a tensor, and on a leaf tensor you can set it directly, e.g. a.requires_grad = True (or a.requires_grad_()).
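A small self-contained example (the shapes and values are arbitrary, just for illustration):

```python
import torch

a = torch.ones(3, requires_grad=True)
b = a * 2                      # b is part of the graph, b.requires_grad is True

# detach(): returns a NEW tensor sharing the same data but cut off from
# the graph; the original b is untouched.
c = b.detach()
print(c.requires_grad)         # False
print(b.requires_grad)         # True

# detach_(): does the same thing in place, so b itself becomes a leaf
# that no longer requires grad.
b.detach_()
print(b.requires_grad)         # False

# requires_grad is the flag telling autograd to track operations on a
# tensor; on a leaf tensor you can flip it directly.
d = torch.zeros(3)
d.requires_grad_()             # same effect as d.requires_grad = True
print(d.requires_grad)         # True
```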