RuntimeError: Error when trying to collate the data into batches with fa_collate, at least two tensors in the batch are not the same size. Please include a transform in after_item that ensures all data of type Tensor is the same size

I am working on an image dataset with images of different sizes. While preprocessing, I got this error:
RuntimeError: Error when trying to collate the data into batches with fa_collate, at least two tensors in the batch are not the same size.

Mismatch found on axis 0 of the batch and is of type Tensor:
Item at index 0 has shape: torch.Size([1, 114, 114])
Item at index 1 has shape: torch.Size([3, 161, 162])

Please include a transform in after_item that ensures all data of type Tensor is the same size
Please tell me how to solve this error, as this is the first time I have seen this type of error. Thank you in advance.

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

class MTLDataset(Dataset):
    def __init__(self, df, transform=None):
        self.path = list(df.path)
        self.age = list(df.age)
        self.gender = list(df.gender)
        #self.ethnicity = list(df.ethnicity)
        self.transform = transform

    def __len__(self):
        return len(self.path)

    def __getitem__(self, idx):
        # image
        path = self.path[idx]
        image = read_image(path)
        image = convert_image_dtype(image, dtype=torch.float32)
        if self.transform:
            image = self.transform(image)
        # age, gender, ethnicity
        age = self.age[idx]
        gender = self.gender[idx]
        #ethnicity = self.ethnicity[idx]

        return image, age, gender
```
```python
def generate_dataloader(df, transformation=None):
    ds = MTLDataset(df, transformation)
    return DataLoader(
        ds,
        batch_size=100,
        num_workers=4,
    )
```
```python
transformation = transforms.Compose([
    transforms.Resize((100, 100)),
    # read_image already returns a tensor, so ToTensor is not needed here
    # transforms.Normalize(mean=[0.485, 0.456, 0.406],
    #                      std=[0.229, 0.224, 0.225])
])
```
```python
train_dataloader = generate_dataloader(train_df)
test_dataloader = generate_dataloader(test_df)
```
```python
train_features, train_age, train_gender = next(iter(train_dataloader))
print(f"Feature batch shape: {train_features.size()}")
print(f"Feature: {train_features[0]}")
```

This is my code. I tried to solve this error but couldn't.

You could either resize the images to the same shape, which would allow the DataLoader to stack them into a single batch, or you could create a custom collate_fn that returns e.g. a list containing image tensors of different shapes. In the latter case you would need to make sure your model can work with this data structure, since usually a single stacked input batch is expected. Note two things in your posted code: `generate_dataloader(train_df)` is called without the `transformation` argument, so `self.transform` is `None` and the `Resize` is never applied; and the error shows a 1-channel and a 3-channel image, so resizing the spatial dimensions alone won't fix the mismatch on the channel axis.
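To illustrate the first option: every item returned by `__getitem__` has to come out with the same shape, including the channel axis. A minimal sketch using plain torch (the function name is my own; `F.interpolate` stands in for the torchvision `Resize` transform, and the grayscale-to-RGB step handles the channel mismatch):

```python
import torch
import torch.nn.functional as F

def to_fixed_size(image, size=(100, 100)):
    # repeat a single-channel image to 3 channels so the channel axis matches
    if image.shape[0] == 1:
        image = image.repeat(3, 1, 1)
    # resize the spatial dims; interpolate expects a leading batch dim
    image = F.interpolate(image.unsqueeze(0), size=size,
                          mode="bilinear", align_corners=False)
    return image.squeeze(0)
```

Applied to the two shapes from the error message, both items come out as `[3, 100, 100]` and can be stacked by the default collate function.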

The error message also seems to come from a higher-level API rather than from plain PyTorch, as I'm not familiar with the after_item call (fa_collate and after_item look like fastai), so it may also be worth checking that library's documentation.
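To illustrate the custom collate_fn approach mentioned above, here is a sketch (the function name is my own) that keeps the variable-size images in a plain Python list while still stacking the labels into tensors:

```python
import torch
from torch.utils.data import DataLoader

def list_collate(batch):
    # batch is a list of (image, age, gender) tuples from __getitem__;
    # keep the differently shaped image tensors in a list instead of stacking
    images = [item[0] for item in batch]
    ages = torch.tensor([item[1] for item in batch])
    genders = torch.tensor([item[2] for item in batch])
    return images, ages, genders

# hypothetical usage with the dataset from the question:
# loader = DataLoader(ds, batch_size=100, num_workers=4, collate_fn=list_collate)
```

The model (or a preprocessing step inside it) then has to iterate over the list, since it no longer receives a single 4-D batch tensor.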