Memory leak in dataloader

When I use the dataloader provided at https://github.com/PatrickTUM/UnCRtainTS/blob/main/data/dataLoader.py, I run into a memory leak: the dataloader gradually consumes all my RAM, eventually freezing my desktop.

Some experiments I did to try to fix it:

(1) I tried to convert all the Python lists to np.array, but it did not help.
(2) I tried just iterating over the dataloader, without any training, and the memory still leaks:

sen12mscr_test = SEN12MSCR(opt.data_root, split='test', cloud_masks='s2cloudless_map_mask')
test_dataloader = torch.utils.data.DataLoader(sen12mscr_test, batch_size=2, shuffle=False, num_workers=0, pin_memory=True)
for pdx, samples in enumerate(test_dataloader):
    print(pdx)
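On experiment (1): the usual reason "convert lists to np.array" is suggested for DataLoader memory growth is copy-on-write. A Python list of objects gets its refcounts touched by every worker process, which forces pages to be copied; a fixed-width NumPy string array is one contiguous buffer that workers can read without copying. A minimal sketch of that conversion (the paths below are purely illustrative):

```python
import numpy as np

# Hypothetical list of sample paths, as a Dataset might store internally.
file_paths = ["ROIs1158/s1_1.tif", "ROIs1158/s1_2.tif"]

# Converting to a NumPy array gives a fixed-width unicode dtype (kind 'U'):
# one contiguous buffer with no per-item Python objects to refcount.
paths_arr = np.array(file_paths)
print(paths_arr.dtype.kind)  # 'U'
```

Note that this copy-on-write mechanism only matters with num_workers > 0; since the snippet in (2) uses num_workers=0, it would not apply there, which may be why the conversion did not help.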
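To narrow down where the growth comes from, one option is to repeat experiment (2) while logging Python-heap usage with the stdlib tracemalloc. The ToyDataset below is a hypothetical stand-in for SEN12MSCR so the sketch is self-contained; swapping in the real dataset would show whether memory climbs steadily across batches, which would point at the dataset's __getitem__ (e.g. an internal cache) rather than at training code:

```python
import tracemalloc
import torch
from torch.utils.data import DataLoader, Dataset

class ToyDataset(Dataset):
    """Hypothetical stand-in for SEN12MSCR, returning a fixed-size tensor."""
    def __len__(self):
        return 32
    def __getitem__(self, idx):
        return torch.zeros(3, 32, 32)

loader = DataLoader(ToyDataset(), batch_size=2, shuffle=False, num_workers=0)

tracemalloc.start()
baseline, _ = tracemalloc.get_traced_memory()
growth = []  # MB above baseline, sampled every few batches
for pdx, samples in enumerate(loader):
    if pdx % 4 == 0:
        current, _ = tracemalloc.get_traced_memory()
        growth.append((current - baseline) / 1e6)
        print(f"batch {pdx}: {growth[-1]:.2f} MB above baseline")
tracemalloc.stop()
```

One caveat: tracemalloc only sees allocations made through the Python allocator, so tensor storage allocated by PyTorch's own C allocator may not show up; watching the process RSS (e.g. with psutil) covers that case.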

@Yuyang_Hu the file is not available at the mentioned link. Can you re-post the link to the file on GitHub?

https://github.com/PatrickTUM/UnCRtainTS/blob/main/data/dataLoader.py is the correct link.

[quote="Yuyang_Hu, post:3, topic:205626"]
[/quote]
Sorry, it is the correct one. Thanks.