Using torch.load on a torch hub model

I am currently playing around with the DETR object detection toolkit, and I am downloading the pretrained models as:

 model = torch.hub.load('facebookresearch/detr', 'detr_resnet50', pretrained=True)

This downloads the files to the local PyTorch cache directory. My question is: would it be possible to load this model with the torch.load method afterwards? My use case is that I would like to use these models offline, and once they are downloaded, what would be the best way to use them without relying on torch.hub?

Is it possible to generate a torch.jit traced model from this, which I could then simply load with torch.jit.load or something similar?
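For reference, the trace-and-reload workflow I have in mind would look roughly like this. This is only a sketch with a tiny stand-in module, since I'm not sure yet whether DETR itself traces cleanly; with the real model you would pass the module returned by torch.hub.load instead of TinyNet:

```python
import torch
import torch.nn as nn

# Tiny stand-in model; with DETR you would use the module
# returned by torch.hub.load('facebookresearch/detr', ...).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)

model = TinyNet().eval()
example = torch.rand(1, 3, 32, 32)

# Record the computation graph for this example input and save it.
traced = torch.jit.trace(model, example)
torch.jit.save(traced, "tiny_traced.pt")

# Later, offline: no Python class definition is needed to reload.
loaded = torch.jit.load("tiny_traced.pt")
with torch.no_grad():
    assert torch.allclose(loaded(example), model(example))
```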

Have you tried downloading the DETR model directly from the facebookresearch repo on GitHub and loading it?

state_dict = torch.load("detr-r50-e632da11.pth")

To load the dict into the model, this detr-demo-notebook suggests building a model class, instantiating it minimally (with just the number of classes), and then calling load_state_dict(), which is not defined in the notebook.
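The general pattern would be something like the sketch below. TinyNet is a hypothetical stand-in; with DETR you would construct the model class from the repo and load the downloaded detr-r50-e632da11.pth checkpoint instead of this toy state dict:

```python
import torch
import torch.nn as nn

# Hypothetical minimal model class standing in for DETR.
class TinyNet(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.fc = nn.Linear(16, num_classes)

# Simulate a downloaded checkpoint by saving a state dict to disk.
model = TinyNet(num_classes=91)
torch.save(model.state_dict(), "checkpoint.pth")

# Later, offline: rebuild the class and load the weights.
model2 = TinyNet(num_classes=91)
state_dict = torch.load("checkpoint.pth", map_location="cpu")
model2.load_state_dict(state_dict)  # raises if the keys don't match
```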

model = torch.hub.load('facebookresearch/detr', 'detr_resnet50', pretrained=True)

I’m still figuring out how to avoid a dict mismatch (‘missing keys’, etc.).
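One way to diagnose the mismatch: load_state_dict(strict=False) doesn't raise, and instead returns the lists of missing and unexpected keys, which tells you exactly which parameter names differ between the checkpoint and the model you built. A minimal illustration with two deliberately mismatched toy modules:

```python
import torch
import torch.nn as nn

# Two modules whose parameter names deliberately don't match.
class A(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

class B(nn.Module):
    def __init__(self):
        super().__init__()
        self.head = nn.Linear(4, 2)

sd = A().state_dict()

# strict=False returns the incompatibilities instead of raising.
result = B().load_state_dict(sd, strict=False)
print(result.missing_keys)     # parameters B expects but sd lacks
print(result.unexpected_keys)  # entries in sd that B does not use
```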