Loss.requires_grad = True in validation?

Hi community,

I have a question regarding validation loss. I have train, validation, and test splits in my setup.

I put training and validation in one train function. During training I call model.train(), and during validation I call model.eval().

However, I am not sure how I should handle loss.requires_grad and loss.backward() under model.eval(). Should I keep loss.requires_grad = True and call loss.backward() during validation?

            loss.requires_grad = True

No, you should not train the model during the validation phase. Usually you would wrap the validation loop in a with torch.no_grad() block to disable Autograd and to save memory, since the intermediate forward activations (which are only needed for the gradient computation) are then not stored.
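A minimal sketch of what that validation loop could look like (the model, criterion, and batch data here are made up purely for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical tiny model and loss, just to show the pattern.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()

def validate(model, batches):
    model.eval()                      # switch dropout/batchnorm to eval behavior
    total_loss = 0.0
    with torch.no_grad():             # disable Autograd: no graph, no backward()
        for x, y in batches:
            out = model(x)
            loss = criterion(out, y)  # loss.requires_grad is False here
            total_loss += loss.item()
    model.train()                     # restore training mode for the next epoch
    return total_loss / len(batches)

# One fake validation batch of 4 samples
batches = [(torch.randn(4, 10), torch.tensor([0, 1, 0, 1]))]
avg_loss = validate(model, batches)
```

Note that loss.backward() is never called here: the validation loss is only accumulated for monitoring, not used to update the parameters.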