How can I use O2 optimization with torch.cuda.amp like apex

It looks like there's no `opt_level` option in `torch.cuda.amp`, so how can I use `opt_level=O2` in PyTorch?

`torch.cuda.amp` doesn't expose `opt_level` flags at all. Its `autocast` + `GradScaler` workflow roughly corresponds to apex's `O1` behavior (ops are cast to float16 where it's safe, while parameters stay in float32), and an `O2` equivalent (casting the model weights themselves to float16 with float32 master weights) is not available in the native API.
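For reference, a minimal sketch of the native AMP training loop that replaces `apex.amp.initialize(..., opt_level="O1")`; the model, optimizer, and data here are placeholders:

```python
import torch

use_amp = torch.cuda.is_available()
device = "cuda" if use_amp else "cpu"

# Placeholder model/optimizer standing in for your own training setup
model = torch.nn.Linear(8, 4).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# GradScaler handles dynamic loss scaling (apex did this internally)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

for _ in range(3):
    inputs = torch.randn(16, 8, device=device)
    targets = torch.randn(16, 4, device=device)
    optimizer.zero_grad()
    # autocast runs the forward pass in mixed precision on CUDA;
    # with enabled=False it is a no-op, so this also runs on CPU
    with torch.cuda.amp.autocast(enabled=use_amp):
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
    # scale the loss, backprop, then unscale and step; steps with
    # inf/NaN gradients are skipped automatically
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```

Note that parameters remain float32 throughout; only selected ops inside `autocast` run in float16, which is why this matches `O1` rather than `O2`.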
