Intermediate results in Checkpoint

Can we save intermediate results in a checkpoint?
Is it important to save intermediate results in a checkpoint?

Which intermediate results do you mean?
Also, do you mean torch.utils.checkpoint or creating a checkpoint by serializing the model?
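
For context, the two meanings differ quite a bit. A minimal sketch of the first one, `torch.utils.checkpoint` (activation checkpointing), assuming a recent PyTorch version that accepts the `use_reentrant` argument; the layer sizes here are made up for illustration:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Two toy layers; the segment wrapped in checkpoint() does NOT store
# its intermediate activations during forward - they are recomputed
# during backward to save memory.
layer1 = nn.Linear(4, 8)
layer2 = nn.Linear(8, 2)

x = torch.randn(3, 4, requires_grad=True)
out = checkpoint(lambda t: layer2(torch.relu(layer1(t))), x, use_reentrant=False)
out.sum().backward()

# Gradients still flow back through the checkpointed segment
print(x.grad.shape)  # torch.Size([3, 4])
```

So `torch.utils.checkpoint` is a memory-saving trick during training, while serializing the model (e.g. via `torch.save(model.state_dict(), ...)`) is about persisting the trained parameters to disk.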

Does it save only the learnable parameters, or the intermediate results as well?

Could you explain what you mean by “intermediate results”?

For example, in a transformer model

the outputs are calculated in the forward pass, and the layer weights and biases are the intermediate results.

Correct me if I am wrong.

The weights and biases would be parameters, and I wouldn’t call them intermediate results.
I think the intermediate activations could be called intermediate results, but I’m still unsure if that’s really what you are looking for, as I haven’t heard this naming so far.
By intermediate activations I mean e.g.:

def forward(self, x):
    intermediate_act1 = self.layer1(x)
    intermediate_act2 = self.layer2(intermediate_act1)
    return intermediate_act2
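
To make the distinction concrete, here is a small runnable sketch (the model and layer sizes are made up for illustration) showing that `state_dict()` contains only the parameters, not the intermediate activations computed in the forward pass:

```python
import torch
import torch.nn as nn

# Hypothetical toy model, just to inspect what state_dict() stores
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(4, 8)
        self.layer2 = nn.Linear(8, 2)

    def forward(self, x):
        intermediate_act1 = self.layer1(x)        # intermediate activation
        intermediate_act2 = self.layer2(intermediate_act1)
        return intermediate_act2

model = ToyModel()
model(torch.randn(3, 4))  # run a forward pass; activations are transient

# Only parameters (weights and biases) appear; no activations are saved
print(sorted(model.state_dict().keys()))
# ['layer1.bias', 'layer1.weight', 'layer2.bias', 'layer2.weight']
```

So saving a checkpoint via `torch.save(model.state_dict(), ...)` persists the weights and biases, while the intermediate activations exist only during the forward/backward pass.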

Thanks @ptrblck, that clarifies it for me.