Hi community,
I have a question regarding validation loss. I have train, validation, and test splits in my pipeline.
I run training and validation inside one train function. During training I call model.train(), and during validation I call model.eval().
However, I am not sure how I should handle loss.requires_grad and loss.backward() under model.eval(). Should I keep loss.requires_grad = True and call loss.backward() during validation?
```python
loss.requires_grad = True
loss.backward()
```
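To make the structure concrete, here is a minimal sketch of what my train function looks like. The model, loss, optimizer, and loaders are just placeholders so the snippet runs on its own; the question is about the two commented-out lines in the validation phase:

```python
import torch
import torch.nn as nn

# Placeholder model, loss, optimizer, and data, only to make the sketch self-contained.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
train_loader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(5)]
val_loader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(2)]

def train_one_epoch():
    # Training phase: model.train(), compute loss, backprop, update weights.
    model.train()
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()

    # Validation phase: model.eval(), compute loss for monitoring only.
    model.eval()
    for inputs, targets in val_loader:
        loss = criterion(model(inputs), targets)
        # Should these two lines stay here as well?
        # loss.requires_grad = True
        # loss.backward()
```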