I’m using the following code:
vgg_model = models.vgg19_bn(pretrained=True)
If I crop my images to 240×240, the network works fine, but for the following input sizes it throws a size-mismatch error:
(3L, 480L, 640L)
size mismatch, m1: [1 x 153600], m2: [25088 x 4096] at /pytorch/torch/lib/TH/generic/THTensorMath.c:1293
(3L, 426L, 640L)
size mismatch, m1: [1 x 133120], m2: [25088 x 4096] at /pytorch/torch/lib/TH/generic/THTensorMath.c:1293
(3L, 428L, 640L)
size mismatch, m1: [1 x 133120], m2: [25088 x 4096] at /pytorch/torch/lib/TH/generic/THTensorMath.c:1293
(3L, 425L, 640L)
size mismatch, m1: [1 x 133120], m2: [25088 x 4096] at /pytorch/torch/lib/TH/generic/THTensorMath.c:1293
(3L, 640L, 481L)
size mismatch, m1: [1 x 153600], m2: [25088 x 4096] at /pytorch/torch/lib/TH/generic/THTensorMath.c:1293
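For context, the m1 numbers in these errors appear consistent with VGG's downsampling arithmetic: vgg19_bn's classifier starts with a linear layer expecting 512 × 7 × 7 = 25088 features, which only matches when the feature map comes out 7×7 (240×240 happens to floor down to 7×7 through the pools). A quick pure-Python sketch, assuming five stride-2 max-pools with floor division as in torchvision's VGG feature extractor:

```python
def flattened_feature_size(h, w, pools=5, channels=512):
    """Flattened feature count after `pools` 2x2 stride-2 max-pools."""
    for _ in range(pools):
        h, w = h // 2, w // 2  # max-pool floors odd dimensions
    return channels * h * w

print(flattened_feature_size(240, 240))  # 25088 -> matches the classifier
print(flattened_feature_size(480, 640))  # 153600 -> m1 in the first error
print(flattened_feature_size(426, 640))  # 133120 -> m1 in the second error
```

So any input whose feature map doesn't floor to 7×7 will fail at the first fully connected layer.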
Any ideas on how to correct this?