My model looks like this:
```python
import torch
import torch.nn as nn

class ERModeler(nn.Module):
    def __init__(self, vocab_size, embedding_dim, weInput):
        super(ERModeler, self).__init__()
        self.embeddings = nn.Embedding.from_pretrained(weInput)
        self.embeddings.weight.requires_grad = False
        self.linear = nn.Linear(1, 2)
```
When I check the parameters, it surprises me that the first entry in the parameter list is the embedding weight matrix. I expected it to show up there only when `self.embeddings.weight.requires_grad = True`:
```python
model = ERModeler(VOC_SIZE, EMBEDDING_DIM, weInput)
print(list(model.parameters())[0])
```
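Here is a minimal self-contained version that reproduces what I see, with a dummy 5×3 random matrix standing in for my real `weInput` and made-up sizes in place of `VOC_SIZE` / `EMBEDDING_DIM`:

```python
import torch
import torch.nn as nn

# Dummy pretrained embedding matrix: 5 words, 3 dimensions.
weInput = torch.randn(5, 3)

class ERModeler(nn.Module):
    def __init__(self, vocab_size, embedding_dim, weInput):
        super(ERModeler, self).__init__()
        # from_pretrained already freezes the weights by default (freeze=True),
        # so requires_grad is False even before the next line.
        self.embeddings = nn.Embedding.from_pretrained(weInput)
        self.embeddings.weight.requires_grad = False
        self.linear = nn.Linear(1, 2)

model = ERModeler(5, 3, weInput)

# parameters() yields every registered parameter, frozen or not;
# the embedding weight is listed first because it was registered first.
first = list(model.parameters())[0]
print(first.requires_grad)  # False: listed, but it will not receive gradients

# Filtering by requires_grad keeps only the trainable parameters,
# e.g. for handing to an optimizer:
trainable = [p for p in model.parameters() if p.requires_grad]
print(len(trainable))  # 2: the linear layer's weight and bias
```

So the frozen embedding still appears in `model.parameters()`; it just has `requires_grad=False`.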