GPU memory problem

I am using PyTorch to train my model on an NVIDIA RTX 3050. Training works fine, but when I tried to run inference it stopped like this


I checked the code and there are no errors, and it was running fine before.
Any suggestions?

What kind of memory problem are you seeing?
Your script seems to hang, so I’m unsure how it’s related to the GPU and its memory.

When I used a 112*112 sample size, training and inference both worked fine. When I doubled it to 224*224 it worked at first, but then the script started to hang like that. When I tried 112*112 again, it worked.
I assumed that was caused by the memory.
If it's not, what could be the problem?
And if I want to check the memory used, or free it all, what should I do?
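For the last question, a minimal sketch of how you could inspect and release GPU memory in PyTorch (assuming a CUDA build; `model` and `x` below are placeholders for your own model and input):

```python
import torch

def report_gpu_memory():
    """Print the current GPU memory usage, if a CUDA device is available."""
    if not torch.cuda.is_available():
        print("CUDA not available")
        return
    # memory actually occupied by live tensors
    allocated = torch.cuda.memory_allocated() / 1024**2
    # total memory held by PyTorch's caching allocator (>= allocated)
    reserved = torch.cuda.memory_reserved() / 1024**2
    print(f"allocated: {allocated:.1f} MiB, reserved: {reserved:.1f} MiB")

report_gpu_memory()

# During inference, wrapping the forward pass in no_grad avoids storing
# activations for backprop, which noticeably lowers memory use:
#
#     with torch.no_grad():
#         out = model(x)
#
# To release cached blocks back to the driver (this does NOT free tensors
# that are still referenced -- delete or move those first):
#
#     del out
#     torch.cuda.empty_cache()
```

Note that doubling each spatial dimension from 112 to 224 quadruples the pixels per sample (224*224 = 4 * 112*112), so the activation memory grows far more than 2x, which would be consistent with running out of GPU memory at the larger size.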