Serializing a model to disk

I am saving my entire model, optimizer parameters, etc. to a file every couple of epochs via

states = dict()

# inside the training loop, every couple of epochs:
state = {
    'model': net.module.state_dict(),
    'optim': optimizer.state_dict(),
    'acc': acc,
    'loss': loss,
}

states[epoch] = state

# the '{}' placeholder presumably gets filled via .format(...) elsewhere
torch.save(states, './{}/states.tar')

So basically I have three nested dictionaries, and according to the output the data is saved correctly. But how is it that the resulting file size does not increase as training proceeds? It always remains at ~76 MB, no matter what.

This does not save the history of previous states/parameters; only the last (current) state is effectively saved, which explains why the file size always stays the same. The reason is that state_dict() returns tensors that share storage with the live parameters rather than copies, and the optimizer updates those parameters in place. Every states[epoch] entry therefore points at the same underlying storage, and torch.save serializes each shared storage only once.
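
If you want each epoch's entry to hold that epoch's actual weights, you can deep-copy the state dicts before storing them. A minimal sketch, assuming the same net, optimizer, states, acc and loss names as in the snippet above:

import copy

state = {
    'model': copy.deepcopy(net.module.state_dict()),  # real copies, no shared storage
    'optim': copy.deepcopy(optimizer.state_dict()),
    'acc': acc,
    'loss': loss,
}

states[epoch] = state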

If you want to save the history, you can also save the state after each epoch under a different file name, one file per epoch, for example, as in the sketch below.
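
A minimal sketch of that suggestion (the checkpoints directory name is an assumption):

import os

import torch

os.makedirs('./checkpoints', exist_ok=True)  # hypothetical output directory
torch.save(state, './checkpoints/state_epoch_{}.tar'.format(epoch))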

And why does my approach not save the entire history?
It should, for all I know…

Is there no way to store the training history other than using unique file names?

I just checked:

checkpoint = torch.load('./states.tar')  # path assumed: wherever states.tar was written

for epoch in checkpoint:
    print(checkpoint[epoch]['loss'])

yields the entire history of loss values over all saved epochs. I assume the same holds for the model and the optim dictionaries. But, as I said before, I am confused as to why the file size does not grow over time…
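
The loss entries are plain Python floats, so a fresh value is stored for every epoch, while the tensors inside the state dicts all share storage with the live parameters; torch.save writes each shared storage only once. That is why the loss history survives but the file size stays flat. A minimal self-contained demonstration of this (the layer size and epoch count are arbitrary):

import copy
import io

import torch

net = torch.nn.Linear(1000, 1000)

# All three entries share storage with the live parameters:
states = {epoch: net.state_dict() for epoch in range(3)}
buf = io.BytesIO()
torch.save(states, buf)
print(len(buf.getvalue()))  # ~4 MB: the weights are serialized only once

# Deep copies give three independent sets of weights:
states = {epoch: copy.deepcopy(net.state_dict()) for epoch in range(3)}
buf = io.BytesIO()
torch.save(states, buf)
print(len(buf.getvalue()))  # ~12 MB: one copy of the weights per epoch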