How can I train a model with only fp16?
I want the same behavior as apex with opt_level="O3", i.e. pure FP16 rather than mixed precision.
The deprecated apex.amp opt_level="O3" used "pure" FP16, so you can just call .half() on your model and input data in your training script.
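A minimal sketch of the casting step, assuming a toy nn.Linear model (any nn.Module works the same way): .half() converts the parameters and buffers to torch.float16, and the input batch is cast the same way.

```python
import torch
import torch.nn as nn

# Hypothetical toy model for illustration.
model = nn.Linear(16, 1)

# Pure FP16: cast parameters/buffers and the inputs to half precision.
model = model.half()
x = torch.randn(8, 16).half()

print(next(model.parameters()).dtype)  # torch.float16
print(x.dtype)                         # torch.float16
```

On CUDA you would also move both to the GPU (e.g. `model.cuda()`, `x.cuda()`); note that FP16 kernel coverage on CPU varies by PyTorch version.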
If so, should I remove torch.cuda.amp.autocast and the GradScaler?
Also, is there any way to make the model converge? Right now it only predicts zero or one in my binary classification task…
Yes, you won't need autocast or the GradScaler, since you are explicitly skipping the mixed-precision utilities. Note, however, that pure FP16 training is generally not stable.
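To illustrate, a pure-FP16 training step is just a plain loop with no autocast context and no scaler calls. The tiny elementwise "model" (w * x + b) below is hypothetical, chosen so the sketch also runs on CPU, where FP16 matmul support varies by version:

```python
import torch

torch.manual_seed(0)
# Parameters created directly in FP16 (equivalent to calling .half()).
w = torch.nn.Parameter(torch.ones(4, dtype=torch.float16))
b = torch.nn.Parameter(torch.zeros(4, dtype=torch.float16))
opt = torch.optim.SGD([w, b], lr=0.1)

x = torch.randn(4).half()
target = torch.randn(4).half()

# No autocast context, no GradScaler: everything stays in FP16.
opt.zero_grad()
loss = ((w * x + b - target) ** 2).mean()
loss.backward()  # gradients are FP16 too; small values can underflow to zero
opt.step()
print(loss.dtype)  # torch.float16
```

The underflow of small FP16 gradients is exactly what GradScaler guards against in mixed precision, which is one reason pure FP16 runs often diverge or collapse to a constant prediction.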