I have a weird issue when appending tensors to a list.
I have code similar to this:

```python
tensor_list = []
for i in range(24):
    current_tensor = function(...)
    tensor_list.append(current_tensor)
    # print(tensor_list)
print(tensor_list)
```
When I uncomment the print inside the loop, the values in the final print are correct. But when I comment it out, I get wrong values in the final print. Any idea what could cause this?
The change I have made is in the `BertModel` class, where I added a loop around the encoder function call (line 731).
The issue is that printing something inside the loop changes the list values.
I think I solved it. I was using a CUDA stream for a tensor transfer before the function call; when I removed it, the values were correct. Maybe a bug in cuda_stream?
I’m glad you solved the issue.
If you are manually using CUDA streams, you have to take care of the synchronizations as described here, which might explain this issue if you haven't done so.
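To illustrate the point, here is a minimal sketch of the kind of synchronization that is needed when a tensor is copied on a side stream and then consumed on the default stream. The function name and structure are illustrative, not taken from your code:

```python
import torch

def transfer_and_use(x):
    """Copy x to the GPU on a side stream, then use it safely."""
    if not torch.cuda.is_available():
        # CPU fallback so the sketch runs anywhere
        return x * 2

    side_stream = torch.cuda.Stream()
    with torch.cuda.stream(side_stream):
        # Asynchronous copy enqueued on side_stream
        y = x.pin_memory().cuda(non_blocking=True)

    # Without this, work on the default stream may read y
    # before the copy has finished, producing wrong values
    torch.cuda.current_stream().wait_stream(side_stream)
    return y * 2
```

Omitting the `wait_stream` call is exactly the kind of missing synchronization that can produce wrong values that "fix themselves" when you add a print, since printing a CUDA tensor forces a synchronization.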