torch.load(): meaning of map_location=lambda storage, loc: storage

I understand that one use of map_location is to control which device a model is loaded onto, rather than the device where it was trained. And I found that map_location=lambda storage, loc: storage is a way of loading a model on the CPU. But what is the meaning of this lambda function? What are storage and loc here? How does it work?
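As I understand it (a minimal sketch, not authoritative): when map_location is a callable, torch.load calls it once per deserialized storage, passing the storage object and a string tag (such as "cpu" or "cuda:0") recording where that storage lived when the checkpoint was saved. Returning the storage unchanged keeps it on the CPU, since that is where deserialization materializes it by default:

```python
import io
import torch

# Save a small tensor to an in-memory buffer, then load it back.
buf = io.BytesIO()
torch.save(torch.arange(4.0), buf)
buf.seek(0)

# `storage` is the deserialized storage; `loc` is the original
# device tag. Returning `storage` as-is leaves everything on CPU,
# even if the checkpoint was written from a GPU.
t = torch.load(buf, map_location=lambda storage, loc: storage)
print(t.device)
```

This lambda is therefore equivalent in effect to map_location="cpu" for checkpoints saved from any device.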


For posterity: torch.load — PyTorch 1.13 documentation