Hi, using apex.amp or torch amp, is it possible that I switch between mixed precision training and full precision training after the training is started? For example, I might want the first 100 iterations to be trained with full precision, and switch to the mixed precision mode after the 100 iterations. Is this possible?
Yes. You can use the enabled argument of torch.cuda.amp.autocast, and the same argument when creating the GradScaler, and switch it between True and False during your training run.
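A minimal sketch of one way to do this: run the first warmup_iters iterations in full precision, then enable autocast (and route the backward pass through the GradScaler) afterwards. The tiny model, the loss, and the names warmup_iters and use_amp are illustrative placeholders, not part of any API; the code falls back to plain FP32 on a CPU-only machine.

```python
import torch

warmup_iters = 100
num_iters = 120
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(10, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Create the scaler once; enabled=False turns its methods into no-ops,
# so this also degrades gracefully when CUDA is unavailable.
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

for it in range(num_iters):
    # Full precision for the first warmup_iters steps, mixed precision after.
    use_amp = it >= warmup_iters

    x = torch.randn(8, 10, device=device)
    optimizer.zero_grad()

    # autocast(enabled=False) runs the forward pass in full precision.
    with torch.cuda.amp.autocast(enabled=use_amp):
        out = model(x)
        loss = out.float().pow(2).mean()

    if use_amp:
        # Mixed precision: scale the loss, step through the scaler.
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()
    else:
        # Full precision: ordinary backward and optimizer step.
        loss.backward()
        optimizer.step()
```

Keeping a single GradScaler alive across the switch (rather than recreating it each iteration) preserves its internal scale state once mixed precision is active.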