What is the utility of `requires_grad = False`?

Hi everyone,
I have found a script that sets `requires_grad` of the first 10 layers to `False`.
I am trying to understand why we set `requires_grad = False`. I know it is used to freeze certain layers in the network so that no gradients are computed for them, but what is the benefit for the computation?
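For context, here is a minimal sketch of what such a script typically does. The model itself is hypothetical (a simple stack of linear layers), but the freezing pattern is the standard one:

```python
import torch
import torch.nn as nn

# Hypothetical model: a stack of 12 small linear layers.
model = nn.Sequential(*[nn.Linear(4, 4) for _ in range(12)])

# Freeze the first 10 layers: their parameters will receive no gradients.
for layer in list(model.children())[:10]:
    for param in layer.parameters():
        param.requires_grad = False

# Run a forward and backward pass.
out = model(torch.randn(2, 4)).sum()
out.backward()

# Frozen parameters keep .grad == None; unfrozen ones get gradients.
print(model[0].weight.grad)               # None
print(model[11].weight.grad is not None)  # True
```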


Setting this to `False` means that no gradients are needed, and so the autograd state does not need to be saved for this part of the net. It will therefore reduce the memory consumption of the model (and slightly speed it up).
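You can observe this directly: when a layer's parameters have `requires_grad = False` (and its input does too), its output is not part of any autograd graph, so no intermediate tensors are saved for the backward pass. A small sketch with a single hypothetical linear layer:

```python
import torch
import torch.nn as nn

frozen = nn.Linear(4, 4)
for p in frozen.parameters():
    p.requires_grad = False

x = torch.randn(2, 4)  # plain input, requires_grad is False by default
y = frozen(x)

# No autograd graph is recorded for this computation, so nothing
# needs to be kept in memory for a backward pass through it.
print(y.requires_grad)  # False
print(y.grad_fn)        # None
```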

So it is not about avoiding the destruction of any information contained in those layers during future training rounds? Because I want to preserve the weights of the first layers of my network, to be used in another case.

Not sure what you mean by "destruction of any information contained in those layers".
If you are referring to the Tensors saved for the backward pass: they are only useful for that purpose, and since you don't need a backward pass here, it is irrelevant.
Otherwise, nothing else is removed from the layers.

The information I am talking about is the weight values. I want to keep these values for other processing; the goal is to compare the final results.

Yes, if these Tensors are not in the optimizer, they won't get updated.
And if they are in the optimizer but their `.grad` field is `None`, they won't get updated either, because the backward pass won't populate the `.grad` field.
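A small sketch illustrating the second case: the frozen parameters are passed to the optimizer, but since their `.grad` stays `None`, the optimizer step leaves them untouched (the model here is hypothetical):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4))

# Freeze the first layer.
for p in model[0].parameters():
    p.requires_grad = False

before = model[0].weight.clone()

# Pass ALL parameters to the optimizer, including the frozen ones.
opt = torch.optim.SGD(model.parameters(), lr=0.1)
model(torch.randn(2, 4)).sum().backward()
opt.step()

# Frozen weights are unchanged: their .grad was None, so SGD skipped them.
print(torch.equal(model[0].weight, before))  # True
```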

Thank you very much; you helped me a lot.