Hi,
So I have a dataset class that is responsible for reading from two directories containing different numbers of images. This is the `__getitem__` of my dataset class:
```python
def __getitem__(self, idx):
    # pick a random index into the out-of-distribution (ood) image list
    idx_ood = random.randint(0, 5374)

    # in-distribution image, indexed via the dataframe
    image = Image.open(
        os.path.join(self.path_to_images, self.df.index[idx]))
    image = image.convert('RGB')

    # ood image
    image_ood = Image.open(self.ood_names[idx_ood])
    image_ood = image_ood.convert('RGB')

    # multi-label target: one entry per prediction label
    label = np.zeros(len(self.PRED_LABEL), dtype=int)
    for i in range(len(self.PRED_LABEL)):
        # can leave zero if zero, else make one
        if self.df[self.PRED_LABEL[i].strip()].iloc[idx].astype('int') > 0:
            label[i] = self.df[self.PRED_LABEL[i].strip()].iloc[idx].astype('int')

    if self.transform:
        image = self.transform(image)
        image_ood_tr = self.transform(image_ood)

    if torch.any(torch.isnan(image_ood_tr)):
        print("NAN in ood input image!")

    return (image, label, self.df.index[idx]), \
           (image_ood_tr, idx_ood, self.totensor(image_ood))
```
When I fetch data via the DataLoader, one of the image tensors contains a NaN element. While debugging, I checked the original image before the transform, and it does not contain any NaN. When I ran the code again, there was no NaN in the same image tensor.
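For reference, this is roughly the check I run while debugging. It is only a minimal sketch: `dataset` stands for an instance of the class above, and I iterate samples directly instead of going through the DataLoader.

```python
import torch

# `dataset` is a placeholder for an instance of the dataset class above
for i in range(len(dataset)):
    (image, label, name), (image_ood_tr, idx_ood, image_ood_raw) = dataset[i]

    # transformed tensors returned by __getitem__
    if torch.isnan(image).any() or torch.isnan(image_ood_tr).any():
        print(f"NaN after transform in sample {i} (ood index {idx_ood})")

    # ood image converted with ToTensor only, i.e. before self.transform
    if torch.isnan(image_ood_raw).any():
        print(f"NaN already present before the transform in sample {i}")
```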
Could this be a hardware issue, or can you spot an error in my code?
I would be grateful for your help.