Deleting intermediate activations

Hey there,

Does anyone know if there's a way to delete intermediate activations during forward propagation? More specifically, I want to delete only the intermediate activation data while retaining the grad_fn. Assume for now that I have a way of reintroducing the activations when they're needed during backprop to compute gradients.
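
To make this concrete, here's a rough sketch of the kind of thing I have in mind, using torch.autograd.graph.saved_tensors_hooks (assuming that's an appropriate tool for this). The `storage` dict is just a hypothetical stand-in for wherever the activation data would actually go:

```python
import itertools
import torch

# Hypothetical stand-in for wherever the activation data would actually go
# (disk, another device, or nowhere at all if it can be recomputed).
storage = {}
next_key = itertools.count()

def pack_hook(tensor):
    # Called when autograd would normally save an activation for backward.
    # Return a lightweight key instead, so the graph keeps only the key
    # (and the grad_fn), not the tensor itself.
    key = next(next_key)
    storage[key] = tensor  # in my use case, this is where the data would be "deleted"
    return key

def unpack_hook(key):
    # Called during backward when the activation is needed again -- this is
    # the "reintroduce the activation" step I'm assuming I already have.
    return storage.pop(key)

x = torch.randn(4, 4, requires_grad=True)
with torch.autograd.graph.saved_tensors_hooks(pack_hook, unpack_hook):
    y = x.pow(2).sum()  # grad_fn is built as usual

y.backward()  # unpack_hook runs here to supply the saved activation
print(x.grad)
```

(Of course, this sketch keeps everything alive in the dict; the point is just to illustrate swapping the saved tensor for a key while the grad_fn stays intact.)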

Please let me know if any part of my question doesn’t make sense.

Thanks!