Hello,

I'm trying to test libtorch with Qt, following this useful answer from the PyTorch forum, but when I call forward I get the following exception:

```
forward() Expected a value of type 'Tuple[Tensor, Tensor]' for argument 'inputs' but instead found type 'Tensor'.
Position: 1
Declaration: forward(__torch__.detectron2.export.caffe2_modeling.Caffe2RetinaNet self, (Tensor, Tensor) inputs) -> ((Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor))
```

- The loaded model is a detectron2 model exported via `caffe_tracer.export_torchscript()`.
- The Caffe2Tracer input was just an image of `torch.Size([3, 128, 80])`.
- The input tensor used to forward the network has shape `[3, 128, 80]`, matching the tensor used to export the model.
- The input is then unsqueezed to shape `[1, 3, 128, 80]`.

**My question:**

How can I build a proper `Tuple[Tensor, Tensor]` input from my image?
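For reference, here is my current guess at constructing that tuple (sketched in Python for brevity). I'm assuming the second tensor is a Caffe2-style `im_info` with one `(height, width, scale)` row per image, but that layout is a guess on my part, not something I've confirmed from the export code:

```python
import torch

# Stand-in for my image tensor of shape [3, 128, 80]
img = torch.rand(3, 128, 80)
batched = img.unsqueeze(0)  # [1, 3, 128, 80], one-image batch

# Assumption: second element is im_info, one row per image
# holding (height, width, scale) as floats -> shape [1, 3]
im_info = torch.tensor([[128.0, 80.0, 1.0]])

# The Tuple[Tensor, Tensor] passed as the single 'inputs' argument
inputs = (batched, im_info)
# outputs = model(inputs)
```

Is this the right shape for the second tensor, or does the exported `Caffe2RetinaNet` expect something else there?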

Thanks.