Converting a Python list to a tensor

I am trying to convert a list to a tensor, but I get an error.

ValueError: expected sequence of length 631 at dim 1 (got 284)

This code computes the pixel values for each detected object in the image.

import numpy as np
import PIL.Image
import torch

imgs = PIL.Image.open(img_path).convert('RGB')
image_width, image_height = imgs.size
imgArrays = np.array(imgs)

# Convert normalized box coordinates to pixel coordinates
X = xCenter * image_width
Y = yCenter * image_height
W = Width * image_width
H = Height * image_height

cropped_imagesList = []
for i in range(len(X)):
    x1, y1, w, h = X[i], Y[i], W[i], H[i]
    x_start = int(x1 - w / 2)
    y_start = int(y1 - h / 2)
    x_end = int(x_start + w)
    y_end = int(y_start + h)
    temp = imgArrays[y_start:y_end, x_start:x_end]
    cropped_imagesList.append(temp)

cropped_images = torch.as_tensor(cropped_imagesList)

The error sounds to me as if the rows of the prospective tensor are not all of the same length: the crops have different widths and heights, so they cannot be combined into one rectangular tensor.
Tensors are strictly regular in this respect; if you want to process data of different lengths, you need to resort to padding or something like that.
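As a sketch of the padding idea (assuming each crop is a NumPy array of shape (h, w, 3), as in your code), one could zero-pad every crop to the largest height and width before stacking:

```python
import numpy as np
import torch

def pad_and_stack(crops):
    # Zero-pad each (h, w, 3) crop to the largest height and width
    # in the list, then stack into one (N, H_max, W_max, 3) tensor.
    max_h = max(c.shape[0] for c in crops)
    max_w = max(c.shape[1] for c in crops)
    padded = []
    for c in crops:
        pad_h = max_h - c.shape[0]
        pad_w = max_w - c.shape[1]
        padded.append(np.pad(c, ((0, pad_h), (0, pad_w), (0, 0))))
    return torch.as_tensor(np.stack(padded))

crops = [np.ones((4, 6, 3)), np.ones((5, 2, 3))]
batch = pad_and_stack(crops)
print(batch.shape)  # torch.Size([2, 5, 6, 3])
```

Whether padding or resizing is the right choice depends on what the crops are fed into afterwards.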

Best regards

Thomas

Sir, I do resize all the images, but I still get this error:

RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 5 but got size 31104000 for tensor number 1 in the list.

This means that the shape of the second tensor (at index 1) in the list is incompatible with the shape of the first: their sizes differ in a dimension other than dimension 1, so they cannot be stacked or concatenated. Resizing the full images is not enough here; each cropped patch must be brought to a common shape before combining them.
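A minimal sketch of the resizing approach on the crops themselves (the 64×64 target size is an arbitrary choice for illustration), using bilinear interpolation from torch.nn.functional:

```python
import numpy as np
import torch
import torch.nn.functional as F

def resize_and_stack(crops, size=(64, 64)):
    # Convert each (h, w, 3) crop to a float (3, h, w) tensor, resize it
    # to a common size with bilinear interpolation, then stack the
    # results into one (N, 3, 64, 64) batch tensor.
    resized = []
    for c in crops:
        t = torch.as_tensor(c, dtype=torch.float32).permute(2, 0, 1)
        t = F.interpolate(t.unsqueeze(0), size=size, mode='bilinear',
                          align_corners=False).squeeze(0)
        resized.append(t)
    return torch.stack(resized)

crops = [np.zeros((40, 60, 3)), np.zeros((50, 20, 3))]
batch = resize_and_stack(crops)
print(batch.shape)  # torch.Size([2, 3, 64, 64])
```

Because every crop now has the same shape, torch.stack no longer raises the size-mismatch error.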