PyTorch predict memory problem


(suke) #1

Hi,
when I run a test program in PyTorch, I found that the memory does not seem to be freed completely.
My test program is as below:
self.model.eval()
for i in range(0, img_cnt):
    img = misc.imread(load_name)
    input = img
    # prepare data
    input = self.prepareData(input, target, volatile=True)
    output = self.model(input)
I found that when I test 1920x1080 images, running them one at a time is fine, but if I run them all together in the loop, it warns me about being out of memory.
Does anyone know how I can free memory manually?


(jpeg729) #2
del input, output
gc.collect()
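Applied inside the prediction loop, the suggestion above looks roughly like this (a sketch; `predict` stands in for the poster's model call):

```python
import gc

def predict(x):
    # placeholder for self.model(input)
    return [v * 2 for v in x]

for i in range(3):
    inp = [float(i)] * 4
    out = predict(inp)
    del inp, out    # drop the references before the next iteration
    gc.collect()    # force a collection pass
```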

(suke) #3

It doesn't seem to work.
My model is loaded once and stays valid while the for loop runs. Should I reset the model?


(suke) #4

I tried to improve the code again, and it seems to work now.
I have one more question: when I do a memory check after "output = model(input)",
it still costs about 2 GB of memory. Is that the temporary feature-map memory cost? Shouldn't it be freed immediately once the computation is done?


(jpeg729) #5

I’m not sure I understand the question.


(suke) #6

my code is like below:

output = self.model(input)  # predict

# memory check
import gc
import resource

del input, output
gc.collect()
max_mem_used = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("{:.2f} MB".format(max_mem_used / 1024))

The print always shows about 2000 MB.
Is there any other memory I need to free manually?
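One possible reason the number never goes down: `ru_maxrss` reports the *peak* (high-water-mark) resident set size of the process, so it cannot decrease even after memory is actually freed. A small self-contained sketch (units assume Linux, where `ru_maxrss` is in kilobytes):

```python
import gc
import resource

# Allocate, measure the peak, free, then measure again.
data = [0] * (10 ** 6)
peak_before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
del data
gc.collect()
peak_after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# The peak never drops, even though the list was freed.
print("{:.2f} MB".format(peak_after / 1024))
```

To see the *current* usage go down after freeing, you would need a current-RSS measurement (e.g. reading `/proc/self/status`) rather than a peak figure.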


(jpeg729) #7

None that I can think of.


(Shaohua Li) #8

I met the same problem a couple of days ago. The error happens because you use a different tensor for each iteration. As the input tensor changes, PyTorch creates a new computation graph in each iteration, and for some reason I still don't know, it doesn't release the old graph immediately. Soon the GPU RAM runs out. Solution: use the same tensor, and copy the input image into that tensor in each iteration.
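The "reuse one tensor" suggestion can be sketched as below. This is a minimal, hypothetical example: `model`, `buf`, and `load_image` are stand-ins for the poster's network, preallocated input tensor, and image loading; `torch.no_grad()` is the modern replacement for the old `volatile=True` flag.

```python
import torch

model = torch.nn.Linear(4, 2)    # stand-in for the real network
model.eval()

buf = torch.empty(1, 4)          # allocated once, reused every iteration

def load_image(i):
    # placeholder for misc.imread + preprocessing
    return torch.full((1, 4), float(i))

with torch.no_grad():            # no graph is built, so nothing to leak
    for i in range(3):
        buf.copy_(load_image(i))  # copy new data into the fixed buffer
        output = model(buf)

print(output.shape)               # torch.Size([1, 2])
```

Because `buf` is allocated once and only overwritten with `copy_`, no fresh input tensor (or graph referencing it) accumulates across iterations.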


(suke) #9

Hello, thank you for the reply.
Do you mean I need to set up a placeholder (like TensorFlow does) and transfer the image data into the placeholder?


(Shaohua Li) #11

yeah that’s exactly what I mean :grinning:


(suke) #12

Hello, one more question: the gc module may control CPU memory, but not GPU memory. In the GPU prediction case, most of the computation is done by the GPU, so why is there such a large CPU memory cost? I am confused about it.
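On the CPU-vs-GPU point: `gc` indeed only manages Python objects in host RAM. GPU memory held by CUDA tensors goes back to PyTorch's caching allocator when the last reference dies, and `torch.cuda.empty_cache()` then returns the cached blocks to the driver. A minimal sketch (assuming a reasonably recent PyTorch; the GPU branch only runs when CUDA is available):

```python
import gc
import torch

# Host side: a plain CPU tensor lives in process RAM and is freed
# when its last Python reference is dropped and collected.
x = torch.randn(1000, 1000)       # ~4 MB of CPU RAM
del x
gc.collect()                      # frees the host-side memory

# GPU side: deleting the tensor returns its memory to PyTorch's
# caching allocator; empty_cache() hands cached blocks back to CUDA.
if torch.cuda.is_available():
    y = torch.randn(1000, 1000, device="cuda")
    print(torch.cuda.memory_allocated())  # bytes currently held on GPU
    del y
    torch.cuda.empty_cache()
```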