Make one row of an embedding layer requires_grad = False

Hi,
Is there a way to set requires_grad = False for only one row of an embedding layer?

For example, I want to pad every sequence to 20 words, and I don't want the pad-word's row to move away from zeros.

PackedSequence won't work here because I won't be using any RNN layer.

The padding row will always stay all zeros.
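
If you construct the layer with padding_idx, the pad row also starts out as zeros. A minimal sketch (the vocabulary size, dimension, and pad index below are just placeholders):

```python
import torch.nn as nn

# Assuming the pad token is index 0 in a made-up vocabulary of 100 words.
emb = nn.Embedding(num_embeddings=100, embedding_dim=8, padding_idx=0)

# The row at padding_idx is initialized to all zeros.
print(emb.weight[0])
```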


Why so?
Also, the other reason I want that row non-grad is that I don't want to spend time computing gradients for it.

This is how the Embedding function is designed. In PyTorch, the gradient with respect to the row at padding_idx will always be zero.
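
A quick sketch of what that means in practice (the sizes and pad index here are just examples):

```python
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=100, embedding_dim=8, padding_idx=0)

# A batch of token ids padded with index 0.
tokens = torch.tensor([[5, 7, 0, 0],
                       [3, 0, 0, 0]])

emb(tokens).sum().backward()

# The gradient for the padding row is all zeros, so an optimizer step
# never moves it away from zero; the other rows get normal gradients.
print(emb.weight.grad[0])  # zeros
print(emb.weight.grad[5])  # non-zero
```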
