model.to(device) problem when running inference on other devices like MYRIAD

Hi there!

I loaded my model and data onto CUDA during training.
When I run inference on another device (0), the output is not correct (always zero values).

Does the PyTorch mechanism for loading a model only support CPU/GPU during inference?
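For reference, this is roughly the loading pattern I mean (a minimal sketch; the tiny `nn.Linear` model and the checkpoint file name are placeholders, not my actual code). As far as I understand, `map_location` remaps a checkpoint saved from CUDA to whatever device is available:

```python
import torch
import torch.nn as nn

# Placeholder model standing in for the real architecture.
model = nn.Linear(4, 2)

# Save as if after training; kept on CPU here so the sketch runs anywhere.
torch.save(model.state_dict(), "model.pt")

# Device-agnostic loading: map_location moves tensors stored on CUDA
# to the target device, so the checkpoint can be restored without a GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
state = torch.load("model.pt", map_location=device)

restored = nn.Linear(4, 2)
restored.load_state_dict(state)
restored.to(device)
restored.eval()

# Inputs must live on the same device as the model.
with torch.no_grad():
    out = restored(torch.zeros(1, 4, device=device))
print(out.shape)  # torch.Size([1, 2])
```

MYRIAD itself is not a device PyTorch knows about, so I assume that path goes through a separate runtime rather than `model.to(...)`.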

I want to understand where this error comes from.

Thanks