Clip gradient norm in libtorch

Hi,

Is there an API to clip the gradients of a network, or do I need to implement it myself?

Best,
Afshin

Hello @afshin67,

Yes, there is, here.
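
For convenience, here is a minimal sketch of how it slots into a training step, assuming net, optimizer, input, and target already exist in scope. In recent libtorch releases the helper is torch::nn::utils::clip_grad_norm_ (declared in torch/nn/utils/clip_grad.h and pulled in by torch/torch.h):

#include <torch/torch.h>

// Usual training step; rescale all gradients so their total 2-norm
// is at most 1.0 before the optimizer update.
optimizer.zero_grad();
auto loss = torch::mse_loss(net.forward(input), target);
loss.backward();
torch::nn::utils::clip_grad_norm_(net.parameters(), /*max_norm=*/1.0);
optimizer.step();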


Thanks for the link!

Hi @afshin67, I used the function “clamp” to clip gradients, but I don’t know whether it is correct:
for (int i = 0; i < net.parameters().size(); i++) {
    net.parameters().at(i).grad() =
        torch::clamp(net.parameters().at(i).grad(), -GRADIENT_CLIP, GRADIENT_CLIP);
}
optimizer.step();

Is there anything I have missed? What should I do instead?

Best,
June

@ZhuXingJune
I did not use clamp; I wrote a piece of code myself. You can check whether your approach works by computing the norm of the gradients before and after running the clipping code:

float modelImpl::get_grad_norm(int grad_norm_type) {
    torch::Tensor tmp = torch::zeros({1});

    for (auto &p : layers->named_parameters()) {
        auto z = p.value(); // z is a Tensor, the same as iterating layers->parameters()
        if (!z.grad().defined()) {
            continue; // skip parameters that have no gradient yet
        }
        tmp += torch::pow(torch::norm(z.grad(), grad_norm_type), grad_norm_type);
    }

    // Total norm = (sum of |g|^p)^(1/p); the epsilon guards against p == 0.
    auto total_norm = torch::pow(tmp, 1.0 / (grad_norm_type + 1e-6));
    return total_norm.item<float>();
}
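
The clipping itself (not shown here) can follow the same recipe as PyTorch’s clip_grad_norm_: compute the total norm as above, then scale every gradient in place when that norm exceeds the threshold. A rough sketch, assuming the same layers module and adding a hypothetical clip_grad_norm member:

void modelImpl::clip_grad_norm(float max_norm, int grad_norm_type) {
    torch::NoGradGuard no_grad;

    // Total norm over all parameter gradients, reusing the helper above.
    float total_norm = get_grad_norm(grad_norm_type);

    // Scale factor so the total norm does not exceed max_norm.
    float clip_coef = max_norm / (total_norm + 1e-6f);
    if (clip_coef < 1.0f) {
        for (auto &p : layers->named_parameters()) {
            auto g = p.value().grad();
            if (g.defined()) {
                g.mul_(clip_coef); // in-place, so the stored gradient is updated
            }
        }
    }
}

With that in place, the before/after check looks like:

loss.backward();
float before = get_grad_norm(2);
clip_grad_norm(/*max_norm=*/1.0f, /*grad_norm_type=*/2);
float after = get_grad_norm(2); // should now be <= 1.0 (up to the epsilon)
optimizer.step();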