Does PyTorch do garbage collection for modules whose parameters were registered and added to an optimizer, but which are no longer referenced anywhere?

Are those parameters still tracked?

Unreferenced tensors should be freed by default. Are you seeing any issues with it? If so, did you make sure to delete all references?
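A minimal sketch of what "freed by default" means in practice, assuming a CUDA device is available (the names here are just for illustration): once the last Python reference to a tensor is dropped, its memory no longer counts as allocated, although the caching allocator keeps the block reserved for reuse.

```python
import torch

before = torch.cuda.memory_allocated()
x = torch.randn(1024, 1024, device="cuda")
print(torch.cuda.memory_allocated() - before)  # roughly 4 MB held by x

del x  # drop the only reference to the tensor
print(torch.cuda.memory_allocated() - before)  # back to ~0; block stays reserved by the allocator
```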

Not seeing any issues. However, it’s a whole Module (well, a sub-module whose parameters are added to the super-module’s registered parameters and to the optimizer’s param_groups, but which is later deleted), not a tensor.
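A small sketch of the situation as I understand it (module and variable names are made up). Note that the optimizer keeps strong references to the parameters in its param_groups (and in its state once step() has run), so deleting the sub-module from the parent does not make those parameters collectible; they are still tracked and will still be updated.

```python
import torch

parent = torch.nn.Module()
parent.sub = torch.nn.Linear(4, 4)                # registers sub's parameters on the parent
opt = torch.optim.SGD(parent.parameters(), lr=0.1)

sub_params = list(parent.sub.parameters())
del parent.sub                                    # removes the sub-module from the parent

# The parameters are still referenced from the optimizer's param_groups:
still_tracked = [p for g in opt.param_groups for p in g["params"]]
print(any(p is q for p in sub_params for q in still_tracked))  # True
```

So in this scenario the memory would only be released after the parameters are also removed from (or replaced in) the optimizer, e.g. by rebuilding the optimizer or editing its param_groups.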