Get total memory of GPU

I checked all the methods listed here https://pytorch.org/docs/stable/cuda.html#module-torch.cuda and could not find a single method that returns the total memory of the device.

How can I get the total memory of a GPU device, and the total available GPU memory on that device?

print(torch.cuda.get_device_properties('cuda:0'))
> _CudaDeviceProperties(name='TITAN V', major=7, minor=0, total_memory=12034MB, multi_processor_count=80)

will give you some information about your device.
Note that the CUDA context (and other applications) might take some memory, which will not be tracked by e.g. torch.cuda.memory_allocated() or torch.cuda.memory_cached().
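To put that together: total_memory from the device properties gives the card's full capacity in bytes, while memory_allocated() and memory_reserved() (the newer name for memory_cached()) only report what PyTorch's caching allocator is using. A minimal sketch, assuming a single CUDA device; gpu_memory_summary is just an illustrative helper name:

```python
import torch

def gpu_memory_summary(device=0):
    """Return (total, reserved, allocated) bytes for a CUDA device.

    total     - full capacity reported by the driver
    reserved  - bytes held by PyTorch's caching allocator
    allocated - bytes actually occupied by live tensors
    """
    props = torch.cuda.get_device_properties(device)
    total = props.total_memory
    reserved = torch.cuda.memory_reserved(device)
    allocated = torch.cuda.memory_allocated(device)
    return total, reserved, allocated

if torch.cuda.is_available():
    total, reserved, allocated = gpu_memory_summary(0)
    print(f"total:     {total / 1024**2:.0f} MiB")
    print(f"reserved:  {reserved / 1024**2:.0f} MiB")
    print(f"allocated: {allocated / 1024**2:.0f} MiB")
```

Keep in mind that total - allocated overestimates the free memory, since memory used by the CUDA context and by other processes is invisible to these counters; on recent PyTorch versions, torch.cuda.mem_get_info() queries the driver directly for (free, total) bytes.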