I really don’t know why we need detach and detach_. Sometimes I just have to use them, for example the target of smooth_l1_loss must be detached. What’s the difference between requires_grad, detach, and detach_? Can you give me some info?
Any help, please! I really don’t understand…
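For context, here is a minimal sketch of the smooth_l1_loss case mentioned above. The tensors and the way the target is computed are made up for illustration; the point is only that detaching the target stops gradients from flowing through it:

```python
import torch
import torch.nn.functional as F

pred = torch.randn(4, requires_grad=True)

# Toy "target" that happens to depend on pred; in practice it might
# come from a separate (e.g. target) network.
target = pred * 0.99 + 1.0

# Detach the target so backward() only computes gradients w.r.t. pred,
# not through the graph that produced the target.
loss = F.smooth_l1_loss(pred, target.detach())
loss.backward()

print(pred.grad is not None)  # True: gradient flowed into pred
```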
detach_() is the in-place version of detach(): detach() returns a new tensor that is cut off from the computation graph, while detach_() detaches the tensor itself.
requires_grad is the flag that tells autograd to track operations on the tensor, e.g.
a.requires_grad = True.
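A small sketch with toy tensors that shows all three side by side:

```python
import torch

# requires_grad: tells autograd to record operations on this tensor.
a = torch.ones(3, requires_grad=True)

# detach(): returns a NEW tensor sharing the same storage but removed
# from the computation graph; the result has requires_grad=False.
b = a * 2
c = b.detach()
print(b.requires_grad)  # True
print(c.requires_grad)  # False

# detach_(): detaches the tensor itself, in place.
d = a * 2
d.detach_()
print(d.requires_grad)  # False
```

So the result is the same either way; detach_() just mutates the existing tensor instead of giving you a new one.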