Hi, I have a large evaluation dataset, the same size as the training dataset, and I run a validation phase during training so I can monitor the behavior of the training process.
I've added automatic mixed precision to the training phase, but is it safe to also wrap the validation step with amp.autocast() to speed up its forward pass?
In general, is it safe/recommended to use mixed precision during model evaluation while tuning, and if so, what is the right way to implement it?
for epoch in range(epochs):
    # Training phase
    train_loss, train_score = self.train_model(trainset)
    # Validation phase
    valid_loss, valid_score = self.valid_model(validset)
@torch.no_grad()
def valid_model(self, dataloader):
    self.eval()
    for batch in tqdm(dataloader):
        # Evaluate with mixed precision
        if self.setting.mixed_precision:
            # Runs the forward pass with autocasting, including loss and score calculation
            with amp.autocast():
                loss, score = self.validation_step(batch)
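For context, here is a minimal self-contained sketch of the pattern I'm asking about: autocast wrapping only the forward pass of an evaluation loop, under torch.no_grad(). The tiny model, the fake data, and the CPU/bfloat16 device settings are placeholders just to make it runnable; on GPU one would use device_type="cuda" with the default float16.

```python
import torch
from torch import nn

# Hypothetical tiny model and data, for illustration only.
model = nn.Linear(8, 2)
data = [(torch.randn(4, 8), torch.randint(0, 2, (4,))) for _ in range(3)]
criterion = nn.CrossEntropyLoss()

model.eval()
total_loss = 0.0
with torch.no_grad():  # no gradients are needed during validation
    for inputs, targets in data:
        # autocast only changes the dtypes used in the forward pass;
        # device_type="cpu" with bfloat16 here, "cuda" on GPU.
        with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
            outputs = model(inputs)
            loss = criterion(outputs, targets)
        total_loss += loss.item()

avg_loss = total_loss / len(data)
```

Since no backward pass happens in evaluation, there is no GradScaler involved; the only effect of autocast here is running eligible ops in a lower-precision dtype.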