RuntimeError: number of dims don't match in permute

Hi everyone!
As I mentioned in the title, I got an error from the `permute` function.

    def __getitem__(self, i: int) -> Tuple[torch.Tensor, torch.Tensor]:
        """ Input-Output Generator by index
        Args:
            i (int): Order of data
        Returns:
            (Tuple[torch.Tensor, torch.Tensor]): Image and mask PyTorch tensor
        """

        _id = self.__image_ids[i]
        paths = self.__id2path(_id)

        image = self.__load_n_preprocess(paths["image_path"], False)
        mask = self.__load_n_preprocess(paths["mask_path"], True)

        image = torch.from_numpy(image)
        mask = torch.from_numpy(mask)
        image = image.permute(2, 0, 1)
        mask = mask.permute(2, 0, 1)

        return image, mask

ERROR

    mask = mask.permute(2, 0, 1)
RuntimeError: number of dims don't match in permute

I realized that it gets stuck on `mask.permute` after doing `image.permute` just fine. By the way, the image's shape is torch.Size([256, 256, 3]) and the mask's shape is torch.Size([256, 256]).

Also, this error arose the third time this function was called.

This is the issue – the mask is 2-dimensional, but you’ve provided 3 arguments to mask.permute().

I am guessing that you're converting the image from h x w x c format to c x h x w. However, it looks like the mask is only in h x w format, so there is no third dimension to permute.
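To illustrate, here is a minimal sketch (with hypothetical 256×256 tensors standing in for the loaded image and mask) showing why the 3-argument `permute` fails on the 2-D mask, and two common ways to handle it: swapping the two existing dims, or adding a channel dim with `unsqueeze` if a c x h x w mask is needed downstream.

```python
import torch

# Hypothetical shapes matching the post: image is h x w x c, mask is h x w.
image = torch.zeros(256, 256, 3)
mask = torch.zeros(256, 256)

# Rearrange the image from h x w x c to c x h x w.
image = image.permute(2, 0, 1)

# mask.permute(2, 0, 1) would raise
# "RuntimeError: number of dims don't match in permute"
# because the mask has only two dims. Either permute with two args...
mask_wh = mask.permute(1, 0)    # w x h
# ...or add a channel dim first to get a 1 x h x w mask.
mask_chw = mask.unsqueeze(0)    # 1 x h x w

print(image.shape)     # torch.Size([3, 256, 256])
print(mask_chw.shape)  # torch.Size([1, 256, 256])
```

Whether you want `permute(1, 0)` or `unsqueeze(0)` depends on what shape your loss function expects the mask to have.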


Yeah, you are right. I changed it to two dimensions. Thanks!