Questions about TenCrop/FiveCrop

I am trying to increase my ResNet-50's performance on a relatively small training set and would like to use TenCrop to augment my dataset.
Knowing that TenCrop produces a 5D tensor instead of the usual 4D input, I copied the code from the documentation. However, it still doesn't work and I get the following error:
RuntimeError: Expected 4-dimensional input for 4-dimensional weight [64, 3, 7, 7], but got 5-dimensional input of size [16, 10, 3, 224, 224] instead

Does anyone have an idea how to fix this?

Thanks in advance!

I think you’re missing the later part where you roll the stacked tensor out along the first (minibatch) dimension, marked with !! in the comments below. I’m not sure how to do this as part of a transforms chain, but here’s an example of doing it as part of your forward pass.

Also remember that at the end you’ll have X labels but 10 * X crops, so you need some kind of collapsing function to reduce the 10 * X model outputs back to X final predictions.

Here’s an example using FiveCrop (so the 5s below would be 10s if you used TenCrop):

for i_step, sample in enumerate(train_loader):
    optimizer.zero_grad()
    inputs = sample[0].to(device)
    inputs = torch.stack(transforms.FiveCrop(512)(inputs), dim=0)  # this results in a (5, batchsize, 3, 512, 512) shaped Tensor
    inputs = inputs.view(-1, 3, 512, 512)  # !! i think you were missing this !! rolling out the tensor along the first (minibatch) dimension, effectively treating this as a minibatch of size 5 * batchsize
    inputs = transforms.Resize((HEIGHT, WIDTH))(inputs)
    net_output = net(inputs)
    net_output = net_output.view(5, -1, n_classes)
    net_output = net_output.max(dim=0).values  # This "collapses" the output back onto the desired size. Note max() returns a (values, indices) namedtuple, so take .values
    loss = criterion(net_output, sample[1].to(device))
    loss.backward()
    optimizer.step()

Hope this helps!

Thank you so much!
This really helps me a lot :smiley: