Can I set a different loss (e.g. NLLLoss in an RNN for language generation) in each training iteration?

I actually have 2 questions here:

  1. I would love to ignore different target indices for different examples. I am trying to do this for language generation, where for each input sentence I want to keep (i.e. not ignore) only a small set of words specific to that sentence. Usually, when training a neural network, you set criterion = nn.NLLLoss() once before training starts. In the case described above, can I instead set the loss differently for each training example (i.e. each input sentence) in every training iteration? I guess this is not possible with minibatch training, but could it work with SGD (batch size 1)? A rough sketch of what I have in mind is in the first code block below.

  2. I am not sure what happens when you set ignore_index to a target index you want to ignore. The docs say that targets equal to ignore_index do not contribute to the gradient calculation. Does that mean the derivative of the NLLLoss is normalized by the sum of the exponentials corresponding to the NON-ignored indices, instead of the sum of the exponentials over ALL indices? The second code block below is a toy version of what I am asking about.
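
To make question 1 concrete, here is a minimal sketch of the per-sentence masking I have in mind, using reduction='none' so I get one loss per token; keep_words is a made-up set of target indices standing in for the sentence-specific words I want to keep:

```python
import torch
import torch.nn as nn

# toy shapes: vocabulary of 10 words, a "sentence" of 6 target tokens
vocab_size, seq_len = 10, 6
logits = torch.randn(seq_len, vocab_size, requires_grad=True)
log_probs = torch.log_softmax(logits, dim=1)
targets = torch.randint(vocab_size, (seq_len,))

# made-up per-sentence set of word indices I do NOT want to ignore
keep_words = {2, 5, 7}

# reduction='none' returns one loss value per token instead of a scalar
criterion = nn.NLLLoss(reduction='none')
per_token_loss = criterion(log_probs, targets)

# zero out every token whose target falls outside the keep set
mask = torch.tensor([t.item() in keep_words for t in targets],
                    dtype=torch.float)
loss = (per_token_loss * mask).sum() / mask.sum().clamp(min=1)
loss.backward()
```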
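
And for question 2, here is a toy comparison; ignore_index=4 is just an arbitrary index chosen for illustration:

```python
import torch
import torch.nn as nn

log_probs = torch.log_softmax(torch.randn(3, 5), dim=1)
targets = torch.tensor([1, 2, 4])

# default reduction='mean': average over all 3 targets
loss_all = nn.NLLLoss()(log_probs, targets)

# with ignore_index=4 the third target is skipped; is the mean now
# taken over the 2 remaining targets, and how is the gradient normalized?
loss_ign = nn.NLLLoss(ignore_index=4)(log_probs, targets)
print(loss_all.item(), loss_ign.item())
```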

Thanks in advance! I hope I have articulated my questions, especially the first one, well enough :slight_smile: