I only have a laptop GPU, an NVIDIA RTX™ A2000 with 4 GB of GDDR6.
How can the memory used be 175172 MB, as printed below?
|===========================================================================|
| PyTorch CUDA memory summary, device ID 0 |
|---------------------------------------------------------------------------|
| CUDA OOMs: 0 | cudaMalloc retries: 0 |
|===========================================================================|
| Metric | Cur Usage | Peak Usage | Tot Alloc | Tot Freed |
|---------------------------------------------------------------------------|
| Allocated memory | 0 B | 1159 MB | 175172 MB | 175172 MB |
| from large pool | 0 B | 1158 MB | 170754 MB | 170754 MB |
| from small pool | 0 B | 4 MB | 4417 MB | 4417 MB |
|---------------------------------------------------------------------------|
| Active memory | 0 B | 1159 MB | 175172 MB | 175172 MB |
| from large pool | 0 B | 1158 MB | 170754 MB | 170754 MB |
| from small pool | 0 B | 4 MB | 4417 MB | 4417 MB |
|---------------------------------------------------------------------------|
| GPU reserved memory | 876 MB | 1186 MB | 2736 MB | 1860 MB |
| from large pool | 872 MB | 1180 MB | 2720 MB | 1848 MB |
| from small pool | 4 MB | 6 MB | 16 MB | 12 MB |
|---------------------------------------------------------------------------|
| Non-releasable memory | 0 B | 105782 KB | 104516 MB | 104516 MB |
| from large pool | 0 B | 102456 KB | 98883 MB | 98883 MB |
| from small pool | 0 B | 4054 KB | 5633 MB | 5633 MB |
|---------------------------------------------------------------------------|
| Allocations | 0 | 264 | 155180 | 155180 |
| from large pool | 0 | 33 | 8510 | 8510 |
| from small pool | 0 | 232 | 146670 | 146670 |
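As far as I understand, the "Tot Alloc" column is a lifetime cumulative counter of every allocation the caching allocator has ever served, while "Peak Usage" is the high-water mark of concurrent usage, so "Tot Alloc" can far exceed physical VRAM. A toy sketch of such counters in plain Python (illustrative only, not PyTorch's actual allocator):

```python
# Toy model of cumulative vs. peak allocation counters
# (illustrative only -- not PyTorch's caching allocator).
class AllocCounters:
    def __init__(self):
        self.cur = 0          # bytes currently allocated
        self.peak = 0         # high-water mark of concurrent usage
        self.total_alloc = 0  # lifetime sum of all allocations ("Tot Alloc")
        self.total_freed = 0  # lifetime sum of all frees ("Tot Freed")

    def alloc(self, n):
        self.cur += n
        self.total_alloc += n
        self.peak = max(self.peak, self.cur)

    def free(self, n):
        self.cur -= n
        self.total_freed += n

MB = 1024 * 1024
c = AllocCounters()
# Allocate and free a 1 GB buffer 171 times, e.g. once per training step:
for _ in range(171):
    c.alloc(1024 * MB)
    c.free(1024 * MB)

print(c.peak // MB)         # 1024 -- never more than 1 GB resident at once
print(c.total_alloc // MB)  # 175104 -- lifetime total easily exceeds 4 GB VRAM
```

So a peak of ~1.1 GB fits the 4 GB card even though the lifetime total is ~175 GB.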