Gradient checkpointing during validation and testing

Hi,
I’m using gradient checkpointing to save memory while training a model with PyTorch Geometric. There is no problem during training. However, at evaluation time (validation and testing), I get the following warning:

UserWarning: None of the inputs have requires_grad=True. Gradients will be None

I understand that I get this because I do not compute gradients during evaluation, since it runs under torch.no_grad(). I tried marking the model inputs as requiring gradients with input.requires_grad_(), but the warning still appears.
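
To make it concrete, here is a minimal sketch that reproduces the warning (TinyNet is just a placeholder, not my actual PyG model; on older PyTorch versions drop the use_reentrant argument):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Placeholder model: the checkpointed segment receives an intermediate
# activation, as it usually does in a real network.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(16, 16)
        self.block = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 4))

    def forward(self, x):
        h = self.embed(x)
        # The reentrant checkpoint warns whenever none of its inputs require grad.
        return checkpoint(self.block, h, use_reentrant=True)

model = TinyNet().eval()
x = torch.randn(8, 16)

with torch.no_grad():
    x.requires_grad_()  # does not help: under no_grad(), h = embed(x) is
    out = model(x)      # created with requires_grad=False, so the warning fires
```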

Is this warning a problem? Could it affect the validation results, or does it only affect the gradients, which I’m not computing during evaluation anyway?

Thanks!

I think you can safely ignore the warning, since (as you’ve already said) you are not interested in calculating gradients during validation/testing.
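
If you also want to silence the warning (and skip the pointless recomputation), one common pattern is to bypass the checkpoint outside of training. A sketch with a hypothetical wrapper module, not something specific to your code:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Hypothetical wrapper: route through checkpoint() only while training
# with grad enabled; evaluation takes the plain forward path.
class CheckpointedBlock(nn.Module):
    def __init__(self, block: nn.Module):
        super().__init__()
        self.block = block

    def forward(self, x):
        if self.training and torch.is_grad_enabled():
            # Training: trade compute for memory by recomputing in backward.
            return checkpoint(self.block, x, use_reentrant=True)
        # Eval / no_grad: checkpointing saves nothing, so skip it entirely.
        return self.block(x)

layer = CheckpointedBlock(nn.Sequential(nn.Linear(16, 16), nn.ReLU()))
layer.eval()
with torch.no_grad():
    out = layer(torch.randn(4, 16))  # plain forward, no UserWarning
```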


Great, thank you for confirming it!