Hi, I noticed that my Jupyter notebook uses an extremely large amount of memory (~40 GB) while running. After using the tool from How to debug causes of GPU memory leaks? - #2 by SpandanMadan, I found that most of the memory is held by intermediate tensor variables. For instance, if I have a large numpy array `X` and pass it to a network `net` as `net.cpu()(torch.tensor(X))`, the intermediate tensor `torch.tensor(X)` stays in memory indefinitely, even after the call has finished.
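For concreteness, here is a minimal sketch of the pattern (the model and array shapes are made up; my real ones are much larger):

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical stand-ins for my real model and data:
net = nn.Linear(1_000, 10)
X = np.random.randn(50_000, 1_000).astype(np.float32)

# The temporary tensor created by torch.tensor(X) is the "intermediate"
# I mean: it is never bound to a name, yet memory usage stays high
# after this call returns.
out = net.cpu()(torch.tensor(X))
```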
I am wondering if there is a way to keep only "named" variables/tensors in memory, while automatically freeing anything that is an intermediate result.
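To be clear about what I mean by "named": I know I can work around this by hand, roughly like the following, but I am hoping for something that happens automatically:

```python
import gc

# Name the intermediate on purpose so it can be released explicitly.
x_t = torch.tensor(X)
out = net.cpu()(x_t)

del x_t       # drop the name; the storage should now be collectible
              # (assuming nothing else, e.g. the autograd graph, holds it)
gc.collect()  # ask Python to reclaim it right away
```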
Thank you!