Hi Deeply, thank you for your reply!
The problem I am running into is that if I then use the DataLoader like this:
train_set = torch.utils.data.ConcatDataset([train_set1, train_set2])
train_loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)
for i, (features, labels) in enumerate(train_loader):
    train_features = features
    train_labels = labels
there will be an error:
"RuntimeError: stack expects each tensor to be equal size"
because, with the default DataLoader collation, all the samples in train_set need to have the same size.
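To make the failure concrete, here is a minimal sketch of what I think the default collate_fn ends up doing with two variable-length samples (the lengths 100 and 120 are just made-up examples, not my real data):

import torch

# Two (features, label) samples whose feature tensors have different lengths
# (the shapes here are placeholders):
sample_a = (torch.randn(1, 100), torch.tensor(0))
sample_b = (torch.randn(1, 120), torch.tensor(1))

# The default collate_fn essentially calls torch.stack on the feature tensors,
# which fails when the sizes differ:
try:
    torch.stack([sample_a[0], sample_b[0]])
except RuntimeError as e:
    print(e)  # "stack expects each tensor to be equal size, ..."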
If I modify my collate_fn according to this post: How to create a dataloader with variable-size input, then the batch of training samples comes back as a large list, while the input of the CNN model can only be a tensor.
So I get another error:
"TypeError: conv1d(): argument 'input' (position 1) must be Tensor, not list"
Do you know how to solve this problem?