CUDA Out of Memory Error with ESRGAN Upscaler

Good evening everyone :slight_smile:

This is my first time posting here, and I’m a complete Python/deep learning newbie for the time being, so apologies in advance if I end up wasting your time with this thread.

Anyway, a few days ago I managed to successfully install the ESRGAN upscaler (https://github.com/xinntao/ESRGAN) along with all of its prerequisites.

I tested my first image with my new RTX 2070 Super and the result came out in an astounding 4 seconds (the last time I tried it on a CPU it took 7 hours).

Unfortunately, when I tried to replicate the whole process, the following error popped up in the Anaconda Prompt, and I would greatly appreciate any help resolving it:

Traceback (most recent call last):
  File "test.py", line 34, in <module>
    output = model(img_LR).data.squeeze().float().cpu().clamp_(0, 1).numpy()
  File "C:\Users\ctamv\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 541, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\2414FC7A.Viber_p61zvh252yqyr!Viber.Universal\ESRGAN-master\RRDBNet_arch.py", line 75, in forward
    fea = self.lrelu(self.upconv2(F.interpolate(fea, scale_factor=2, mode='nearest')))
  File "C:\Users\ctamv\Anaconda3\lib\site-packages\torch\nn\functional.py", line 2500, in interpolate
    return torch._C._nn.upsample_nearest2d(input, _output_size(2))
RuntimeError: CUDA out of memory. Tried to allocate 5.21 GiB (GPU 0; 8.00 GiB total capacity; 3.01 GiB already allocated; 2.66 GiB free; 336.43 MiB cached)

I have been trying for hours to solve this problem and have visited multiple other threads, but with no success (mostly because I don’t even know where to input PyTorch commands in the first place, as the Anaconda Prompt doesn’t let me run them…)

Apologies again for my ignorance and thanks in advance for your help :slight_smile:

Did you change anything in the setup or the script or are you just running exactly the same command on the same system?
In the latter case, could you check the GPU memory usage via nvidia-smi?
Are you using this GPU also for your video output on the machine?
I’m not familiar with Windows, but I assume the usage might vary depending on what you are executing.
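If you are not sure where to run PyTorch commands: from the Anaconda Prompt you can start an interpreter by typing python and then query the memory PyTorch itself sees. A rough sketch (these numbers only cover the current Python process, not the other applications using the GPU):

import torch

print(torch.cuda.get_device_name(0))                              # which GPU PyTorch is using
print(torch.cuda.memory_allocated() / 1024**2, 'MiB allocated')   # memory held by live tensors
print(torch.cuda.memory_cached() / 1024**2, 'MiB cached')         # memory held by PyTorch's caching allocator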


Good morning and thanks for the fast reply :slight_smile:

So, to be more exact, this is the test.py file of ESRGAN:

import os.path as osp
import glob
import cv2
import numpy as np
import torch
import RRDBNet_arch as arch

model_path = 'models/RRDB_ESRGAN_x4.pth'  # models/RRDB_ESRGAN_x4.pth OR models/RRDB_PSNR_x4.pth
device = torch.device('cuda')  # if you want to run on CPU, change 'cuda' -> cpu
# device = torch.device('cpu')

test_img_folder = 'LR/*'

model = arch.RRDBNet(3, 3, 64, 23, gc=32)
model.load_state_dict(torch.load(model_path), strict=True)
model.eval()
model = model.to(device)

print('Model path {:s}. \nTesting...'.format(model_path))

idx = 0
for path in glob.glob(test_img_folder):
    idx += 1
    base = osp.splitext(osp.basename(path))[0]
    print(idx, base)
    # read images
    img = cv2.imread(path, cv2.IMREAD_COLOR)
    img = img * 1.0 / 255
    img = torch.from_numpy(np.transpose(img[:, :, [2, 1, 0]], (2, 0, 1))).float()
    img_LR = img.unsqueeze(0)
    img_LR = img_LR.to(device)

    with torch.no_grad():
        output = model(img_LR).data.squeeze().float().cpu().clamp_(0, 1).numpy()
    output = np.transpose(output[[2, 1, 0], :, :], (1, 2, 0))
    output = (output * 255.0).round()
    cv2.imwrite('results/{:s}_rlt.png'.format(base), output)

I have only changed the "cuda" variable here, along with the path of the model I’m using.

Additionally, here’s the output of the nvidia-smi:

Sat Jan 11 10:38:26 2020
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 432.00       Driver Version: 441.87       CUDA Version: 10.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce RTX 207...  WDDM | 00000000:01:00.0  On |                  N/A |
| 48%   29C    P5     5W / ERR! |    760MiB /  8192MiB |      1%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0      1860    C+G   Insufficient Permissions                     N/A    |
|    0      2200    C+G   ...mmersiveControlPanel\SystemSettings.exe   N/A    |
|    0      3360    C+G   ...oftEdge_8wekyb3d8bbwe\MicrosoftEdge.exe   N/A    |
|    0      4684    C+G   ...6)\Google\Chrome\Application\chrome.exe   N/A    |
|    0      4872    C+G   ...1.0_x64__8wekyb3d8bbwe\WinStore.App.exe   N/A    |
|    0      6596    C+G   ...10711.0_x64__8wekyb3d8bbwe\Video.UI.exe   N/A    |
|    0      6704    C+G   ...t_cw5n1h2txyewy\ShellExperienceHost.exe   N/A    |
|    0      8876    C+G   ...5n1h2txyewy\StartMenuExperienceHost.exe   N/A    |
|    0      8904    C+G   ...dows.Cortana_cw5n1h2txyewy\SearchUI.exe   N/A    |
|    0      9220    C+G   ...DIA GeForce Experience\NVIDIA Share.exe   N/A    |
|    0      9872    C+G   ...115.0_x64__8wekyb3d8bbwe\YourPhone.exe    N/A    |
|    0     10476    C+G   ...osoft.LockApp_cw5n1h2txyewy\LockApp.exe   N/A    |
|    0     13632    C+G   C:\Windows\explorer.exe                      N/A    |
|    0     15272    C+G   ...hell.Experiences.TextInput.InputApp.exe   N/A    |
|    0     16068    C+G   ...5.131.0_x64__kzf8qxf38zg5c\SkypeApp.exe   N/A    |
|    0     16216    C+G   ...0410.0_x64__8wekyb3d8bbwe\HxOutlook.exe   N/A    |
|    0     17116    C+G   ... Companion\Application\WebCompanion.exe   N/A    |
|    0     18252    C+G   ...cal\Programs\Microsoft VS Code\Code.exe   N/A    |
+-----------------------------------------------------------------------------+

Finally, yes, it is true that I’m also using this GPU for the video output of my PC.

Thanks once again in advance.

It seems a few processes are using the GPU memory as well. Could you kill them or restart the machine and run the script again?
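To see how much room is actually left when the script reaches the forward pass, you could also drop a quick print into test.py right before the with torch.no_grad() block. Just a sketch, the exact numbers will of course differ on your machine:

    print('allocated: {:.0f} MiB | cached: {:.0f} MiB'.format(
        torch.cuda.memory_allocated() / 1024**2,
        torch.cuda.memory_cached() / 1024**2))
    print('input shape:', img_LR.shape)  # a 4x upscale of a large input can easily need several GiB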


Okay, I can do that (even though I ran the script when my PC had just restarted and it still didn’t work).

But is all of this normal? Is there maybe a command that can free up some GPU memory, and where would I run it?

Unfortunately, I’m not familiar with Windows and don’t know how to prevent applications from using the GPU memory.
You could use a small GPU just for the video output and use the bigger one for computation, if it’s a workstation and you are not planning to use it e.g. for gaming.
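Regarding a command to free memory: torch.cuda.empty_cache() returns the unused memory PyTorch itself is caching back to the driver, but it cannot reclaim memory used by other applications, and it won’t help if a single forward pass simply needs more than the 8 GB your card has. As a rough sketch, you would call it from a Python session like this:

import torch

torch.cuda.empty_cache()  # releases PyTorch's unused cached blocks; other processes are unaffected
print(torch.cuda.memory_cached() / 1024**2, 'MiB still cached by PyTorch')

If the allocation itself is too large for the card, the usual workarounds are to feed smaller input images or to split the image into tiles and upscale them one by one.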
