@fmassa @ptrblck
Hello. I would like to ask a few questions about the behavior of `torch.backends.cudnn.benchmark = True`.
- Does the mini-batch size matter? Many people say that the benchmark results are cached and reused as long as the image input size stays the same, but I have not found a clear explanation of whether changing only the batch size triggers a new benchmark run (see the sketch after this list).
- How many input shapes can the cache hold? For example, I might have two types of input: 224x224 and 320x320. If I constantly alternate between the two sizes, does every switch require additional benchmarking, or does each shape keep its own separate cache entry?
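
Here is a minimal sketch of what I mean (the model, sizes, and timing helper are just placeholders I made up for illustration): my assumption is that if the first forward pass for a new shape is noticeably slower, that is the benchmarking step.

```python
import time
import torch
import torch.nn as nn

torch.backends.cudnn.benchmark = True

# Hypothetical small conv net, only to illustrate the question.
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
).cuda()

def timed_forward(batch_size, hw):
    # Each new (batch_size, H, W) combination may trigger a fresh
    # cuDNN algorithm search when benchmark mode is enabled.
    x = torch.randn(batch_size, 3, hw, hw, device="cuda")
    torch.cuda.synchronize()
    start = time.time()
    with torch.no_grad():
        model(x)
    torch.cuda.synchronize()
    return time.time() - start

# Question 1: does changing only the batch size re-trigger benchmarking?
for bs in (8, 16, 8):
    print(f"batch={bs:3d} 224x224: {timed_forward(bs, 224):.4f}s")

# Question 2: does alternating 224x224 / 320x320 re-benchmark on every
# switch, or are the two shapes cached separately after the first pass?
for hw in (224, 320, 224, 320):
    print(f"batch=  8 {hw}x{hw}: {timed_forward(8, hw):.4f}s")
```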
Thank you in advance for your replies!