AttributeError: module 'torch.cuda' has no attribute 'amp'

I’m running `from torch.cuda.amp import GradScaler, autocast` and got the error in the title.

PyTorch version: 1.4.0



Have you installed the CUDA version of pytorch? Does your environment recognize torch.cuda?
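A quick way to check is to print the build info and probe for the `amp` submodule directly (a minimal diagnostic sketch; `hasattr` avoids the `AttributeError` while checking):

```python
import torch

# Environment check: torch.cuda.amp only exists in recent builds.
print(torch.__version__)           # e.g. '1.4.0' -- amp is not in this release
print(torch.cuda.is_available())   # True if this is a CUDA build with a visible GPU
print(hasattr(torch.cuda, "amp"))  # False on builds that predate the amp module
```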

torch.cuda.amp is available in the nightly binaries, so you would have to update.
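Once you're on a build that ships `torch.cuda.amp`, the usual pattern looks like the sketch below (a minimal toy example, not your model; the `enabled` flags let the same code fall back to plain fp32 on CPU-only machines):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = torch.cuda.is_available()

model = nn.Linear(4, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# GradScaler and autocast become no-ops when enabled=False,
# so this script also runs on a CPU-only build.
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

data = torch.randn(8, 4, device=device)
target = torch.randn(8, 2, device=device)

optimizer.zero_grad()
with torch.cuda.amp.autocast(enabled=use_amp):  # mixed-precision forward pass
    loss = loss_fn(model(data), target)
scaler.scale(loss).backward()  # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)         # unscales grads; skips the step on inf/nan
scaler.update()                # adjusts the scale factor for the next iteration
```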

1 Like

I just got the following error when attempting to use amp. Do you know how I can fix it?

    if amp_enable:
        with th.cuda.amp.autocast():   # <-- this line raises the error below
            out1 = model(sub, inp)
            out2 = temp_ly(sub, out1)
AttributeError: module 'torch.cuda.amp' has no attribute 'autocast'

You might need to install the nightly binary, since autocast wasn’t shipped in 1.5; it’s part of the 1.6 release.
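If you need the same script to run on both 1.5 and newer builds, one option is to feature-detect `autocast` and fall back to a no-op context manager (a hedged sketch; `contextlib.nullcontext` needs Python 3.7+):

```python
import contextlib
import torch

# Use the real autocast when the build provides it, otherwise fall back
# to a do-nothing context manager (plain fp32 execution).
if hasattr(torch.cuda, "amp") and hasattr(torch.cuda.amp, "autocast"):
    autocast = torch.cuda.amp.autocast
else:
    autocast = contextlib.nullcontext

with autocast():
    pass  # forward pass goes here
```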