Model takes 2 MB on disk but 3 GB in RAM once loaded

I trained a model on the GPU and saved it following the official serialization instructions (https://pytorch.org/docs/stable/notes/serialization.html):

torch.save(the_model.state_dict(), PATH)

Then, when I try to load it with

the_model = TheModelClass(*args, **kwargs)
the_model.load_state_dict(torch.load(PATH, map_location='cpu'))

the RAM usage explodes to roughly 3 GB. Is there something I need to be careful of during training? The model is not big: two RNNs and a few fully connected layers, that's it. Or is this normal behaviour?
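One way to narrow this down (not from the original post, just a diagnostic sketch) is to measure the process's resident set size before and after each loading step, so you can tell whether the memory is allocated by `torch.load` itself or by `load_state_dict`. The helper below uses only the standard library; the commented-out lines are placeholders for the post's own `TheModelClass`, `PATH`, etc.:

```python
import resource
import sys

def rss_mb():
    """Peak resident set size of this process, in MB.

    Note: ru_maxrss is reported in kilobytes on Linux but in bytes
    on macOS, so we normalise accordingly.
    """
    ru = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return ru / (1024 * 1024) if sys.platform == "darwin" else ru / 1024

baseline = rss_mb()
print(f"baseline RSS: {baseline:.1f} MB")

# Hypothetical usage, mirroring the steps in the post -- measure each
# step separately to see which one allocates the memory:
#
# the_model = TheModelClass(*args, **kwargs)
# print(f"after constructing model: +{rss_mb() - baseline:.1f} MB")
#
# state = torch.load(PATH, map_location='cpu')
# print(f"after torch.load:         +{rss_mb() - baseline:.1f} MB")
#
# the_model.load_state_dict(state)
# print(f"after load_state_dict:    +{rss_mb() - baseline:.1f} MB")
```

If the jump happens at `torch.load`, the checkpoint itself is the culprit; if it happens when the model is constructed, the module definition allocates the memory regardless of the checkpoint.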

Thank you