How does requires_grad compare to set_grad_enabled?

In this transfer learning tutorial https://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html, requires_grad is first set to False, and then in the training loop set_grad_enabled is called while training. I'm confused — I'm new to PyTorch, so please explain what's going on here?

Hi,

requires_grad is a per-Tensor property that tells autograd whether gradients should be computed for that tensor. In the tutorial, setting requires_grad = False on the pretrained parameters freezes them so they are not updated.
torch.set_grad_enabled is a context manager (and function) that toggles a global flag, letting you disable autograd locally — e.g. the tutorial enables it only during the training phase and disables it during evaluation to save memory and computation.
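To illustrate the difference, here is a minimal sketch: requires_grad marks individual tensors, while set_grad_enabled switches graph recording on or off for everything inside its scope.

```python
import torch

# requires_grad is a per-tensor property: it marks which tensors
# should have gradients computed for them.
w = torch.randn(3, requires_grad=True)   # trainable parameter
x = torch.randn(3)                       # requires_grad=False by default

# set_grad_enabled is a context manager that toggles autograd globally
# inside its scope, regardless of each tensor's requires_grad flag.
with torch.set_grad_enabled(False):      # e.g. during evaluation
    y_eval = (w * x).sum()
print(y_eval.requires_grad)  # False: no graph was recorded

with torch.set_grad_enabled(True):       # e.g. during training
    y_train = (w * x).sum()
print(y_train.requires_grad)  # True: y_train depends on w, which requires grad
```

So even though w has requires_grad=True, wrapping the forward pass in set_grad_enabled(False) prevents any graph from being built — which is exactly what the tutorial does for the validation phase.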