How to compute the gradient of a vector during test process?

Hi everyone, I want to use the gradient of a vector during test or validation epochs. Could anyone show me an example of how to compute the gradient of a vector in test mode?


Could you explain a bit what “gradient of a vector” means?
If you would like to calculate the gradients of all parameters (as done during training), you could just reuse the training loop with the validation or test dataset.
Note that you should not update the model, as this would of course be data leakage.

I am interested in computing the gradients of all parameters. Do you mean I can call `loss.backward()` during testing to compute the gradients of the parameters, but should not call `optimizer.step()` to update the model?

Yes, `loss.backward()` using the test data would compute the gradients in the same way as during training.
`optimizer.step()` would then update all parameters with their gradients, which would cause data leakage, since you would effectively be training on the test dataset.
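A minimal sketch of this idea, using a toy `nn.Linear` model and a random batch standing in for a real test DataLoader (both are placeholders, not from the thread). The key points are: don't wrap the forward pass in `torch.no_grad()` (that would disable gradient computation), and skip `optimizer.step()` so the parameters stay untouched:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy model and loss; stand-ins for your real setup.
model = nn.Linear(4, 2)
criterion = nn.CrossEntropyLoss()

# A fake "test" batch standing in for a real test DataLoader.
inputs = torch.randn(8, 4)
targets = torch.randint(0, 2, (8,))

model.eval()  # eval mode (affects dropout/batchnorm); gradients still flow

# Snapshot parameters to verify the model is not updated.
before = [p.detach().clone() for p in model.parameters()]

model.zero_grad()                       # clear any stale gradients
loss = criterion(model(inputs), targets)
loss.backward()                         # populates p.grad for every parameter

# The gradients computed on the test data are now available:
grads = [p.grad.clone() for p in model.parameters()]

# No optimizer.step() is called, so the parameters are unchanged
# and there is no data leakage from the test set.
after = [p.detach() for p in model.parameters()]
assert all(torch.equal(b, a) for b, a in zip(before, after))
```

Note that `model.eval()` only changes the behavior of layers like dropout and batchnorm; it does not disable autograd, so `loss.backward()` works exactly as it does during training.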


Thank you 🙂