DataParallel with test batch size = 1

I have trained a model using DataParallel on 2 GPUs with train_batch_size = 16. Now, for certain unavoidable reasons, I need to test the model with test_batch_size = 1. My question is: is this possible?

With test_batch_size = 1, I get the following error:

ValueError: not enough values to unpack (expected 2, got 1)

So, is this error due to test_batch_size = 1, and if so, how can I test the model with test_batch_size = 1, given that I used DataParallel and train_batch_size > 1?

You could use the internal model directly via model.module and pass the sample to it.
However, the error message suggests the failure happens while unpacking an object, and nn.DataParallel can accept a single sample in the common use case, as seen here:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(1, 1)

    def forward(self, x):
        print('forward on device {}'.format(x.device))
        out = self.fc(x)
        return out

model = MyModel()
model = nn.DataParallel(model).cuda()

# a batch of 8 samples is scattered across the available GPUs
x = torch.randn(8, 1).cuda()
out = model(x)

print('single sample')
# a single sample cannot be split further and runs on one device only
x = torch.randn(1, 1).cuda()
out = model(x)

Output:

forward on device cuda:0
forward on device cuda:1
forward on device cuda:2
forward on device cuda:3
forward on device cuda:4
forward on device cuda:5
forward on device cuda:6
forward on device cuda:7
single sample
forward on device cuda:0
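
If you would rather bypass the DataParallel wrapper entirely during testing, a minimal sketch (continuing from the snippet above, so model is the nn.DataParallel instance already defined there) could look like this:

# model.module is the original MyModel instance wrapped by nn.DataParallel
model.eval()
with torch.no_grad():
    x = torch.randn(1, 1).cuda()   # single test sample
    out = model.module(x)          # runs on one device, no scatter/gather

This skips the scatter/gather logic completely, so the single sample is processed exactly as it would be by the plain, unwrapped model.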