This may be a dumb question, and the answer is probably that even test datasets can be too large to fit in memory. But I would still like to understand why we use a DataLoader for the test set.
As I understand it, the test set is only used for computing metrics and perhaps error analysis; there are no operations (like gradient updates) that need to be carried out on mini-batches. Can someone please confirm the reason?
Example (MNIST):
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False, transform=transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize((0.1307,), (0.3081,))
    ])),
    batch_size=args.test_batch_size, shuffle=True, **kwargs)
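For what it's worth, the batching itself doesn't change the result: a metric like accuracy accumulated per mini-batch equals the accuracy computed over the whole set at once, so streaming the test set through a loader costs nothing in correctness. A minimal sketch of that idea (plain Python with hypothetical predictions and labels, standing in for `model(inputs)` outputs on each batch):

```python
# Hypothetical predictions and labels; in practice these would come
# from model(inputs) for each batch yielded by test_loader.
preds  = [0, 1, 1, 0, 1, 0, 1, 1]
labels = [0, 1, 0, 0, 1, 1, 1, 1]

def batched(seq, batch_size):
    """Yield consecutive mini-batches, like a (non-shuffling) DataLoader."""
    for i in range(0, len(seq), batch_size):
        yield seq[i:i + batch_size]

# Accumulate correct counts batch by batch, as an evaluation loop would.
correct = total = 0
for p_batch, l_batch in zip(batched(preds, 3), batched(labels, 3)):
    correct += sum(p == l for p, l in zip(p_batch, l_batch))
    total += len(p_batch)

batch_acc = correct / total
full_acc = sum(p == l for p, l in zip(preds, labels)) / len(preds)
assert batch_acc == full_acc  # batching does not change the metric
```

So the loader is mostly about memory and convenience (transforms, parallel workers), not about any per-batch operation the metric requires.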
I also have a follow-up question to ask…