I have been working with pretrained embeddings (GloVe) and ran into a problem where memory usage kept increasing. After I updated the code below, the memory growth stopped.
What I want to ask is:
- Do we need to update (fine-tune) the pretrained embeddings?
- Is there any relationship between updating the embeddings and the memory increase?
```python
self.embedding = nn.Embedding.from_pretrained(embedding)
if embedding is not None:
    self.embedding.weight = nn.Parameter(embedding)
self.variable_lengths = variable_lengths
self.embedding.weight.requires_grad = False  # memory increases every time when True
```
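For reference, `nn.Embedding.from_pretrained` already accepts a `freeze` argument (default `True`) that sets `requires_grad` on the weight, so manually wrapping the tensor in `nn.Parameter` and toggling `requires_grad` afterwards shouldn't be necessary. A minimal sketch, using a random tensor as a stand-in for the GloVe matrix:

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained GloVe matrix (vocab of 100, dim 50).
embedding = torch.randn(100, 50)

# freeze=True (the default) sets requires_grad=False on the weight,
# so the embedding accumulates no gradient buffers during backward.
emb_frozen = nn.Embedding.from_pretrained(embedding, freeze=True)

# freeze=False keeps the embedding trainable; its .grad tensor is
# allocated and updated on every backward pass.
emb_trainable = nn.Embedding.from_pretrained(embedding, freeze=False)

print(emb_frozen.weight.requires_grad)     # False
print(emb_trainable.weight.requires_grad)  # True
```

When `requires_grad=True`, extra memory for the gradient of the (often large) embedding matrix is expected; steadily *growing* memory usually points to something else, such as holding references to the computation graph across iterations (e.g. accumulating a loss tensor without calling `.item()`).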