Do we need to update pretrained embeddings?

I have been working with pretrained embeddings (GloVe) and ran into a memory increase problem. After I changed the code below, the memory increase no longer occurred.

What I want to ask is:

  1. Do we need to update the pretrained embeddings?

  2. Is there any relationship between embedding updates and the memory increase problem?

     if embedding is not None:
         # Load the pretrained (GloVe) vectors into the embedding layer.
         self.embedding = nn.Embedding.from_pretrained(embedding)
     self.variable_lengths = variable_lengths
     self.embedding.weight.requires_grad = False  # memory increases every time when True

If you want to update the embedding layer, the gradients for its weights would need to be stored during backpropagation and would thus use additional memory.
Whether you should freeze the embeddings or fine-tune them depends on your use case.
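As a side note, `nn.Embedding.from_pretrained` already freezes the weights by default via its `freeze` argument, so the manual `requires_grad = False` line is redundant in that case. A minimal sketch (using a random tensor as a stand-in for the GloVe matrix):

```python
import torch
import torch.nn as nn

# Hypothetical pretrained matrix standing in for GloVe vectors
# (vocab of 100 words, 50-dimensional embeddings).
pretrained = torch.randn(100, 50)

# freeze=True is the default: requires_grad is False, so no gradient
# buffers are allocated for these weights during backward.
frozen = nn.Embedding.from_pretrained(pretrained, freeze=True)
assert frozen.weight.requires_grad is False

# freeze=False keeps the weights trainable (fine-tuning); autograd then
# stores gradients for the full embedding matrix, which costs memory.
trainable = nn.Embedding.from_pretrained(pretrained, freeze=False)
assert trainable.weight.requires_grad is True
```

With `freeze=False`, the optimizer may also keep per-parameter state for the whole matrix (e.g. Adam's moment estimates), which can be significant for a large vocabulary.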