Related to pretrained models

Can we train on one dataset for some time with some layers frozen, and then switch to another dataset with different layers frozen, in a pretrained XLNet model?

Is it possible to freeze layers of pretrained models?

Yes, you can freeze the parameters of a specific layer by setting their .requires_grad attribute to False:

model = ... # model initialization
# freeze parameters of my_layer
for param in model.my_layer.parameters():
    param.requires_grad_(False)
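
One related detail (a minimal sketch; AdamW and the learning rate are only placeholders, not something this thread prescribes): if you create the optimizer after freezing, you can pass it only the trainable parameters, so the frozen ones are never registered for updates:

import torch

# Build the optimizer from the trainable parameters only, so frozen
# parameters are never updated (AdamW and lr are assumed placeholders).
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad),
    lr=2e-5,
)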

@ptrblck thank you, I will try

@ptrblck
I want to train on dataset 1 with layers 6-11 frozen, and then on dataset 2 with layers 0-5 frozen. My code for dataset 1:
for epoch in range(EPOCHS):
    print(f'Epoch {epoch + 1}/{EPOCHS}')
    print('-' * 10)
    if epoch == 0:
        # freeze a slice of parameters by positional index (intended: layers 6-11)
        for name, param in list(xlnet_model.named_parameters())[104:206]:
            print('I will be frozen: {}'.format(name))
            param.requires_grad = False

    train_acc, train_loss = training(xlnet_model, train_data_loader, optimizer, device, scheduler, len(train_data))

    print(f'Train loss {train_loss} Train accuracy {train_acc}')

Will this statement "for name, param in list(xlnet_model.named_parameters())[104:206]:" freeze only layers 6-11, while the other weights keep updating? (I am using a pretrained model.)

I don’t know if these indices correspond to layers 6 to 11; you would have to verify it.
A safer approach might be to create a list of layer names and compare the submodule and/or parameter names against this list. If a name matches an entry in the list, you would freeze that parameter.
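
A minimal sketch of that name-based approach (assuming the Hugging Face transformers implementation of XLNet, where the twelve transformer blocks appear in parameter names as layer.0 through layer.11; the checkpoint name and index printout are only for illustration, so verify the names for your model first):

from transformers import XLNetModel

xlnet_model = XLNetModel.from_pretrained('xlnet-base-cased')

# First print all parameter names with their positional indices to verify
# what a slice such as [104:206] would actually cover.
for idx, (name, _) in enumerate(xlnet_model.named_parameters()):
    print(idx, name)

# Freeze layers 6-11 (for dataset 1) by matching names instead of indices.
frozen_prefixes = ['layer.{}.'.format(i) for i in range(6, 12)]
for name, param in xlnet_model.named_parameters():
    if any(prefix in name for prefix in frozen_prefixes):
        print('I will be frozen: {}'.format(name))
        param.requires_grad = False
    else:
        param.requires_grad = True  # keep everything else trainable

# For dataset 2, rebuild frozen_prefixes for layers 0-5 and run the loop again.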

@ptrblck, thank you, it's working.