Transfer learning, requires_grad

If I set `requires_grad = True` for a layer in a pretrained model, does the layer get initialized randomly? Or does it keep the pretrained parameters and then get retrained during training?

Hi, initialization does not depend on what you set for `requires_grad`. If you don't load a pretrained checkpoint or weights, the model will be initialized with random weights (check this documentation - Initializing pre-trained models).
Setting `requires_grad = True` simply enables gradient computation for those parameters, whatever their initialization (random or loaded from a pretrained checkpoint), so they can be updated in subsequent training steps.
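To illustrate, here is a minimal sketch of a common transfer-learning setup. The choice of `resnet18` and the 10-class head are just placeholders for the example, not part of the original question:

```python
import torch
import torchvision.models as models

# Load a ResNet-18 with pretrained ImageNet weights; the parameters come
# from the checkpoint, not from random initialization.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze every layer: no gradients will be computed for these parameters.
for param in model.parameters():
    param.requires_grad = False

# Replace the final classifier head. Newly constructed layers are always
# randomly initialized, and their parameters have requires_grad=True by default.
model.fc = torch.nn.Linear(model.fc.in_features, 10)  # e.g. 10 target classes

# Unfreezing a pretrained layer does NOT re-initialize it: it keeps its
# pretrained weights and simply becomes trainable again.
for param in model.layer4.parameters():
    param.requires_grad = True

# Only parameters with requires_grad=True are updated by the optimizer.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```

So flipping `requires_grad` only controls whether gradients flow to those weights; the values themselves are untouched until the optimizer updates them.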