What is the meaning of requires_grad?

I am confused about `requires_grad`. When the `requires_grad` of a tensor or Parameter is False, does that mean the layers before it will not backpropagate the gradient? Or does it mean something else? Thanks very much.

A Tensor has `requires_grad=True` if gradients for it need to be computed during the backward pass. This can be either because the Tensor itself needs gradients (in the case where it is a leaf Tensor), or because some Tensor that was used to compute it requires gradients, so gradients need to flow through it to reach the previous layers.
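
Here is a minimal sketch illustrating both cases (the tensor names `x`, `w`, and `y` are just for illustration):

```python
import torch

# Leaf tensor that requires gradients: autograd will populate x.grad.
x = torch.ones(3, requires_grad=True)

# Leaf tensor that does not require gradients: autograd treats it as a constant.
w = torch.ones(3, requires_grad=False)

# y is a non-leaf tensor. Its requires_grad is True, not because we set it,
# but because one of its inputs (x) requires gradients, so gradients must
# flow through y during the backward pass.
y = (x * w).sum()
print(y.requires_grad)  # True

y.backward()
print(x.grad)  # tensor([1., 1., 1.]) -- gradient computed for the leaf x
print(w.grad)  # None -- no gradient is computed or stored for w
```

Note that setting `requires_grad=False` on an input does not block gradients from flowing through the graph to other inputs; it only means no gradient is computed *for that tensor*, and autograd can skip any work that is needed only for it.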