Can I use 'torch.cuda.amp.autocast' with 'torch.einsum'?

Can I use 'torch.cuda.amp.autocast' with 'torch.einsum'? Will this work?

Since einsum is a fairly dynamic op, I think amp.autocast might keep it in the "safe" region and do the computation in FP32.

It should be quick to try and find out, e.g. by checking the output dtype as in the sketch below.
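
A minimal sketch of that check, assuming a CUDA device is available: run torch.einsum inside an autocast region and look at the output dtype to see whether it was lowered to FP16 or kept in FP32. The tensor shapes and the einsum equation here are arbitrary examples.

```python
import torch

# Example inputs on the GPU (FP32 by default).
a = torch.randn(8, 16, 32, device="cuda")
b = torch.randn(8, 32, 64, device="cuda")

# Run einsum under autocast and inspect the result's dtype.
with torch.cuda.amp.autocast():
    out = torch.einsum("bik,bkj->bij", a, b)

# torch.float16 if autocast lowered the op, torch.float32 if it stayed in the "safe" region.
print(out.dtype)
```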