Hi all,
I would like to run one of the torchvision detection models on a batch from a dataset.
I am fairly new to PyTorch (though I like it so far; I am much more comfortable with TensorFlow). I tried to use one of the Faster R-CNN models to predict on a batch of images.
When calling model(dataloader) I get the following error:
RuntimeError: stack expects each tensor to be equal size, but got [3, 960, 720] at entry 0 and [3, 960, 1021] at entry 2
Is the model complaining, or the dataloader? It seems like it's the dataloader.
Can it not work with varying image sizes?
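For reference, here is a minimal repro with dummy tensors (the sizes are taken from the traceback; `VaryingSizeDataset` is just a stand-in for my real dataset). It also shows the workaround I'm considering, since I read that torchvision detection models accept a plain list of 3xHxW tensors of different sizes:

```python
import torch
from torch.utils.data import Dataset, DataLoader

# Stand-in for my real dataset: images of varying widths
class VaryingSizeDataset(Dataset):
    def __init__(self):
        self.sizes = [(3, 960, 720), (3, 960, 800), (3, 960, 1021)]
    def __len__(self):
        return len(self.sizes)
    def __getitem__(self, idx):
        return torch.rand(self.sizes[idx])

# Default collate_fn calls torch.stack, which fails on unequal sizes
loader = DataLoader(VaryingSizeDataset(), batch_size=3)
try:
    batch = next(iter(loader))
except RuntimeError as e:
    print(e)

# Possible workaround: collate the batch into a plain list instead of stacking
list_loader = DataLoader(VaryingSizeDataset(), batch_size=3,
                         collate_fn=lambda batch: list(batch))
images = next(iter(list_loader))

# Then (commented out to avoid the weight download) I would call, e.g.:
# model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
# model.eval()
# predictions = model(images)  # expects a list of 3xHxW tensors
```

Is the list-collate approach above the intended way to feed variable-size images, or should I be resizing/padding them first?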
Thank you for any hints and help!