Can torch.load load lazily to save memory?

Hello, I use a user-defined trainer class to train a torch model, with the usual methods such as eval, predict, etc.
The trainer's init method checks for a checkpoint path from args; if it exists, it loads the checkpoint, so objects such as the model, optimizer, and lr_scheduler are all restored. This is for failover, so that another framework (e.g. a Ray actor) can reload the checkpoint and continue an operation such as fit.
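For context, here is a minimal sketch of the eager-loading pattern I mean (the `Trainer` class, `args.ckpt_path`, and the checkpoint key names are just placeholders for my setup):

```python
import os
import torch

class Trainer:
    def __init__(self, args, model, optimizer, lr_scheduler):
        self.model = model
        self.optimizer = optimizer
        self.lr_scheduler = lr_scheduler
        # Failover: if a checkpoint exists, restore everything up front,
        # even though a later fit/eval/predict may only need part of it.
        if args.ckpt_path and os.path.exists(args.ckpt_path):
            ckpt = torch.load(args.ckpt_path, map_location="cpu")
            self.model.load_state_dict(ckpt["model"])
            self.optimizer.load_state_dict(ckpt["optimizer"])
            self.lr_scheduler.load_state_dict(ckpt["lr_scheduler"])
```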
Then I call trainer methods such as fit, eval, predict, etc.
But I only need the optimizer and lr_scheduler when I call fit; if I call predict, I only need the model.
So I want to wrap the checkpoint in some lazy operation so that each object is only loaded when it is actually needed. This will cost more I/O, but can save memory.
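Roughly, this is the kind of wrapper I have in mind, assuming each state dict were saved to its own file so it could be read independently (the `LazyCheckpoint` class, directory layout, and file names here are hypothetical, not an existing API):

```python
import torch

class LazyCheckpoint:
    """Defer torch.load of each component until it is first accessed."""

    def __init__(self, ckpt_dir):
        self.ckpt_dir = ckpt_dir
        self._cache = {}

    def _load(self, name):
        # Costs an extra file read per component, but keeps unused
        # state dicts (e.g. the optimizer during predict) out of memory.
        if name not in self._cache:
            path = f"{self.ckpt_dir}/{name}.pt"
            self._cache[name] = torch.load(path, map_location="cpu")
        return self._cache[name]

    @property
    def model(self):
        return self._load("model")

    @property
    def optimizer(self):
        return self._load("optimizer")

    @property
    def lr_scheduler(self):
        return self._load("lr_scheduler")
```

With something like this, fit would touch `ckpt.optimizer` and `ckpt.lr_scheduler`, while predict would only ever trigger the model file read.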
But it seems the default torch.load can't do this. How can I achieve it?