def stack_features(features, something):
    return torch.stack([torch.stack(ft, dim=0) for ft in features], dim=0)
but I get this error:
line 100, in stack_features
return torch.stack([torch.stack(pad_features(ft), dim=0) for ft in features], dim=0)
RuntimeError: stack expects each tensor to be equal size, but got [1024, 7] at entry 0 and [1024, 3] at entry 1
So I tried to solve it by making a collate function:
def collate_fn(ft):
    return tuple(zip(*ft))

def stack_features(features, something):
    return torch.stack([torch.stack(collate_fn(ft), dim=0) for ft in features], dim=0)
but now I am getting this error:
TypeError: expected Tensor as element 0 in argument 0, but got tuple
This happens because collate_fn returns tuples of tensors, and torch.stack expects a list of tensors.
Any ideas on how to solve it? Thank you in advance
I don’t understand how the collate_fn is supposed to solve the issue since you cannot stack tensors in dim0 if their size differs in dim1, so you might want to resize, slice, or pad the tensors.
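To illustrate the padding approach: here is a minimal sketch that zero-pads each tensor along dim1 to the largest width before stacking, using torch.nn.functional.pad. The helper name pad_and_stack is made up for this example, and whether zero-padding is appropriate depends on your use case:

```python
import torch
import torch.nn.functional as F

def pad_and_stack(tensors):
    # Find the largest size in dim1 across all tensors.
    max_len = max(t.size(1) for t in tensors)
    # F.pad with (0, n) right-pads the last dimension with n zeros.
    padded = [F.pad(t, (0, max_len - t.size(1))) for t in tensors]
    # Now all tensors have the same shape and can be stacked in dim0.
    return torch.stack(padded, dim=0)

a = torch.randn(1024, 7)
b = torch.randn(1024, 3)
out = pad_and_stack([a, b])
print(out.shape)  # torch.Size([2, 1024, 7])
```

This matches the shapes from your error message: [1024, 7] and [1024, 3] are padded to [1024, 7] each and stacked into [2, 1024, 7].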
No, BucketIterator does not provide a drop_last argument, since it creates batches of similar lengths to minimize padding. I also don’t know why you would need to increase the batch size to avoid this error. Is the batch dimension missing? If so, you can unsqueeze it.
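For the unsqueeze case, a quick sketch of adding a missing batch dimension in dim0 (the shapes here are just an assumed example):

```python
import torch

# A single sample without a batch dimension.
x = torch.randn(1024, 7)

# unsqueeze(0) inserts a new dimension of size 1 at dim0,
# turning the sample into a batch of one.
x = x.unsqueeze(0)
print(x.shape)  # torch.Size([1, 1024, 7])
```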