How to convert checkpoint file (.ckpt) to state dict file (.pth)?

Hi! New PyTorch user here :slight_smile:
I trained my model using PyTorch Lightning and ModelCheckpoint with the parameter save_top_k=1, so only the best checkpoint is saved. After training finished I saved the model as usual with torch.save(model.state_dict()).
Now I want to deploy my model for inference. My epoch=42.ckpt file contains a model with better performance than the final model, so I want to use this checkpoint file. But the checkpoint file is three times larger than the normal model file (.pth).
How do I convert a .ckpt file to a .pth file?


CC @williamFalcon and @justusschock for Lightning questions :slight_smile:

What was the reply about converting the checkpoint file to a .pth file? Or are we going in the wrong direction entirely?

I’m not familiar with Lightning, but PyTorch shouldn’t change the file format based on the file extension.
Both files should contain the same data if you save them as .ckpt or .pth.
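As a quick illustration of that point, here is a minimal sketch (using a throwaway nn.Linear and hypothetical file names) showing that the extension has no effect on what torch.save writes or torch.load reads:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)

# The extension is just a name -- torch.save writes the same serialized
# state dict regardless of whether the file is called .ckpt or .pth.
torch.save(model.state_dict(), "weights.ckpt")
torch.save(model.state_dict(), "weights.pth")

# Both files load back to identical state dicts.
sd_ckpt = torch.load("weights.ckpt")
sd_pth = torch.load("weights.pth")
assert all(torch.equal(sd_ckpt[k], sd_pth[k]) for k in sd_ckpt)
```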


A quick explanation that may help other Lightning users! After quite a long experience with Lightning (which I enjoy): the hyperparameters are stored in hparams.yaml, and the .ckpt file (generated in the lightning_logs folder) is really just a .pth file under a different name. Lightning simply keeps a little more information in the same file, such as the hyperparameters and training state. You can load the checkpoint and then take only the state_dict of your model, dropping the extra entries you don't want.
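A minimal sketch of that conversion, assuming your LightningModule stores the network as self.model (so the state_dict keys carry a "model." prefix; adjust the prefix to however your module names it). The resulting .pth is smaller because the optimizer states and other training metadata in the checkpoint are dropped:

```python
import torch

# On recent PyTorch versions you may need weights_only=False here,
# since a Lightning checkpoint contains more than raw tensors.
ckpt = torch.load("epoch=42.ckpt", map_location="cpu")

# A Lightning checkpoint is a dict with training metadata alongside the weights,
# e.g. 'epoch', 'state_dict', 'optimizer_states', 'lr_schedulers', ...
print(ckpt.keys())

state_dict = ckpt["state_dict"]

# Strip the "model." prefix so the plain nn.Module can load the weights directly.
prefix = "model."
state_dict = {
    (k[len(prefix):] if k.startswith(prefix) else k): v
    for k, v in state_dict.items()
}

# Save only the weights as a regular .pth file for inference.
torch.save(state_dict, "epoch=42.pth")

# Later, for deployment:
# net = MyNet()  # hypothetical model class
# net.load_state_dict(torch.load("epoch=42.pth"))
```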