mark87
(mark87)
November 7, 2022, 3:20pm
I understand the meaning of this command (PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:516), but where do you actually write it?
In jupyter notebook? In command prompt?
Export it as an environment variable in your terminal and it should work.
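For example (a minimal sketch using the variable and value from the question; the inline `python3 -c` line is just an illustrative check that a child process inherits the variable):

```shell
# Set the allocator option for this shell session; every process started
# from this shell afterwards (including Python) inherits it.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:516

# Verify that a child Python process actually sees it:
python3 -c 'import os; print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])'
```

Note that the variable only applies to processes launched from that same shell, so run your script in the same session.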
mark87
(mark87)
November 7, 2022, 7:27pm
Thank you for the answer, but now I'm even more confused about where I should write what.
gahaalt
(Szymon Mikler)
November 7, 2022, 7:31pm
You can set environment variables directly from Python:
```
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:516"
```
This must be executed at the beginning of your script/notebook.
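To spell out why the ordering matters (an assumption based on PyTorch reading the variable when it initializes CUDA, not something stated in the thread): the assignment has to happen before the first CUDA call, so it is safest to place it above `import torch`. A minimal sketch:

```python
import os

# Set the allocator config before torch touches CUDA; once the CUDA
# context has been initialized, changing this variable has no effect.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:516"

# Only import torch afterwards (commented out so this sketch runs anywhere):
# import torch

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])
```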
mark87
(mark87)
November 7, 2022, 8:56pm
Didn’t work. Still get CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 6.00 GiB total capacity; 5.33 GiB already allocated; 0 bytes free; 5.34 GiB reserved in total by PyTorch)
First, use the method mentioned above. In the Linux terminal, you can run:
```
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
```
Second, you can try adding --tile to your command:
"decrease the --tile value, such as --tile 800 or smaller"
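To illustrate why a smaller --tile helps (a sketch of the general tiling idea, not Real-ESRGAN's actual implementation; `iter_tiles` is a hypothetical helper): the image is processed window by window, so peak GPU memory scales with the tile size rather than the full image size.

```python
def iter_tiles(width, height, tile):
    """Yield (x, y, w, h) windows that cover a width x height image."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield x, y, min(tile, width - x), min(tile, height - y)

# A 2000x1500 image with --tile 800 is processed as 3 x 2 = 6 windows,
# so each upscaling pass only ever holds one 800x800 (or smaller) tile.
tiles = list(iter_tiles(2000, 1500, 800))
print(len(tiles))  # 6
```

This is also why an enormous value like --tile 12000000 does not help: a tile larger than the image is equivalent to processing the whole image at once.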
opened 02:18PM - 27 Sep 21 UTC
closed 12:20AM - 29 Sep 21 UTC
Thank you first of all for this awesome program!
I'm getting the following error message if I run your program:
```
Testing 0 place
/home/tornax/Apps/Real-ESRGAN/inference_realesrgan.py:84: UserWarning: The input image is large, try X2 model for better performance.
warnings.warn('The input image is large, try X2 model for better performance.')
Error CUDA out of memory. Tried to allocate 1.20 GiB (GPU 0; 3.94 GiB total capacity; 1.54 GiB already allocated; 1.22 GiB free; 1.74 GiB reserved in total by PyTorch)
Tile 1/1
Error local variable 'output_tile' referenced before assignment
If you encounter CUDA out of memory, try to set --tile with a smaller number.
```
I have 12GB of RAM and it's fine for me if it's using all of it. How can I set the allowed RAM usage?
I've tried to call it like this:
```
python inference_realesrgan.py --tile 12000000 --model_path experiments/pretrained_models/RealESRGAN_x4plus_anime_6B.pth --input inputs --outscale 0
```
but it doesn't work as well (I get the same error message).
e.g.
python inference_realesrgan.py --tile 120
I’ve tried these 2 steps, and it works for me.