Hello, everyone.
When I tried to freeze part of my model during training by setting requires_grad to False, I found that memory usage more than doubled at that point. Does PyTorch keep a separate copy of the frozen weights in order to continue execution? If so, is there a better way to save memory, since my GPU can no longer fit the workload?
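For reference, this is roughly how I freeze the layers (a simplified sketch; `backbone` is just an illustrative name for the sub-module I freeze):

```python
import torch
import torch.nn as nn

def freeze_module(module: nn.Module) -> None:
    # Freeze every parameter in the given sub-module so that no
    # gradients are computed or stored for it during backward.
    for param in module.parameters():
        param.requires_grad = False

# Illustrative usage: `model.backbone` stands in for the part of my
# network that I want to freeze partway through training.
# freeze_module(model.backbone)

# The optimizer is then rebuilt over the remaining trainable parameters only.
# optimizer = torch.optim.SGD(
#     [p for p in model.parameters() if p.requires_grad], lr=1e-3
# )
```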
I tracked my memory usage, and it stayed stable until the layers were frozen.
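The per-step totals below come from a memory tracker I call around line 202 of my script. This is not the exact tracker that produced the log, just an equivalent sanity check with plain PyTorch calls (illustrative only):

```python
import torch

def report_gpu_memory(tag: str, device: int = 0) -> None:
    # Print the currently allocated tensor memory and the peak so far, in MB.
    allocated = torch.cuda.memory_allocated(device) / 1024 ** 2
    peak = torch.cuda.max_memory_allocated(device) / 1024 ** 2
    print(f"[{tag}] allocated: {allocated:.1f} MB | peak: {peak:.1f} MB")

# report_gpu_memory("before freeze")
# freeze_module(model.backbone)
# report_gpu_memory("after freeze")
```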
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 52 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 44 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
- | 42 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
- | 50 * Size:() | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 54 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 46 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
- | 52 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 44 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 48 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
+ | 56 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 54 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 46 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8269.6 Mb
+ | 50 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
+ | 58 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 48 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
- | 56 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 60 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 52 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 50 * Size:(1,) | Memory: 0.0001 M | <class 'torch.Tensor'> | torch.float32
- | 58 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 62 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 54 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 60 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 52 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 56 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 64 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 62 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 54 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 58 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 66 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 56 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 64 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 68 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 60 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 58 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 66 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:8267.4 Mb
+ | 70 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
+ | 62 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 68 * Size:() | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
- | 60 * Size:(1,) | Memory: 0.0002 M | <class 'torch.Tensor'> | torch.float32
At Backbone_idsub <module>: line 202 Total Used Memory:20112.1 Mb
Looking forward to your reply :).