I don’t know if you noticed, but you have asked about the exact same error message in a post 2 days ago.
RuntimeError: stack expects each tensor to be equal size, but got [3, 288, 352] at entry 0 and [3, 256, 256] at entry 1
And you have already gotten an answer to it.
That answer still applies here; the only difference is that you are now using a custom dataset.
If you want a specific answer for your custom dataset:
In utils/dataset.py you resize images by a constant scaling factor instead of to a fixed size:
def preprocess(cls, pil_img, scale):
    w, h = pil_img.size
    # The new size depends on the original size, so differently sized
    # inputs remain differently sized after preprocessing.
    newW, newH = int(scale * w), int(scale * h)
    assert newW > 0 and newH > 0, 'Scale is too small'
    pil_img = pil_img.resize((newW, newH))
I am guessing that the images you are using are not all the same size. If you then preprocess them by simply scaling each of them with a constant factor, they will stay differently sized, and the default collate function of the DataLoader cannot stack them into one batch tensor.
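For illustration, here is a minimal sketch of why the batching fails; the two shapes are taken from your error message:

import torch

# Two samples of different sizes, scaled by the same constant factor,
# stay differently sized, so stacking them into one batch tensor fails.
img_a = torch.zeros(3, 288, 352)   # hypothetical first sample
img_b = torch.zeros(3, 256, 256)   # hypothetical second sample
batch = torch.stack([img_a, img_b])
# RuntimeError: stack expects each tensor to be equal size, ...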
You either need images that are all the same size for this scaling to work, or you resize every image to a fixed size, which is also what was suggested to you in the previous post mentioned above. A rough sketch of the second option follows.
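This is only a sketch under assumptions, not the exact change proposed in the earlier post; the name target_size and the value (256, 256) are placeholders you would replace with whatever your model expects:

def preprocess(cls, pil_img, target_size=(256, 256)):
    # Resize every image to one fixed size so that every tensor
    # produced by the dataset has the same shape and can be batched.
    # target_size=(256, 256) is only an example value.
    pil_img = pil_img.resize(target_size)
    # ... continue with the rest of the original preprocessing here

Alternatively you could pad or crop to a common size; the key point is that every sample in a batch must end up with the same shape.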