Saving a model trained on GPU always adds a `module.` prefix to state_dict keys (e.g. `module.conv`)

I am training on a GPU and validating my model after every epoch. When I save it, every key in the `state_dict` dictionary ends up with `module.` prepended to it. I can remove the `module.` prefix from each key with a for loop after every epoch, but is there a better way to do it?
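For context, the `module.` prefix appears because the model is wrapped in `nn.DataParallel` (or `DistributedDataParallel`), which stores the original model under a `.module` attribute; the cleanest fix is to save `model.module.state_dict()` instead of `model.state_dict()`. If you already have a prefixed checkpoint, a small helper can strip the prefix. Below is a minimal sketch of such a helper (the function name `strip_module_prefix` and the dummy keys are my own, and a plain dict stands in for a real state_dict so the example runs without PyTorch):

```python
from collections import OrderedDict


def strip_module_prefix(state_dict):
    """Return a copy of state_dict with any leading 'module.' removed from keys."""
    cleaned = OrderedDict()
    for key, value in state_dict.items():
        # Only strip the prefix when it is actually present, so the helper
        # is safe to call on checkpoints saved without DataParallel too.
        if key.startswith("module."):
            cleaned[key[len("module."):]] = value
        else:
            cleaned[key] = value
    return cleaned


# Dummy checkpoint standing in for torch.load(...) output.
checkpoint = OrderedDict([
    ("module.conv1.weight", "w1"),
    ("module.conv1.bias", "b1"),
    ("fc.weight", "w2"),  # already unprefixed key is left untouched
])

print(list(strip_module_prefix(checkpoint).keys()))
# → ['conv1.weight', 'conv1.bias', 'fc.weight']
```

With real PyTorch objects the usage would be `model.load_state_dict(strip_module_prefix(torch.load(path)))`, but saving `model.module.state_dict()` in the first place avoids the problem entirely.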