[Caffe2] Dropout modules exported from PyTorch with ONNX: BlobIsTensorType(*blob, CPU). Blob is not a CPU Tensor: [BlobNumber]

Hello, I’ve noticed a problem when using workspace.Predictor to execute models exported from PyTorch via ONNX and converted with the Caffe2 mobile_exporter, as shown here: https://nbviewer.jupyter.org/github/cedrickchee/data-science-notebooks/blob/master/notebooks/deep_learning/fastai_mobile/shipping_squeezenet_from_pytorch_to_android.ipynb#Fast.ai-Mobile-Camera-Project

An error reading “BlobIsTensorType(*blob, CPU). Blob is not a CPU Tensor: [BlobNumber]” is raised when executing any model that contains a Dropout module (e.g. MobileNetV2), where BlobNumber is the ID of the Dropout blob in the model (verifiable by printing the net in human-readable form).

I don’t know if there’s any solution other than removing the Dropout module when building the model (which I’ve tried, and it works). This especially affects running the model inside an Android app, since the Android Studio Caffe2 code uses a Predictor to execute models loaded from protobufs.
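For reference, a minimal sketch of the workaround mentioned above: swapping every `nn.Dropout` submodule for `nn.Identity` before export, so the ONNX graph contains no Dropout op at all. `TinyNet` and `strip_dropout` are hypothetical names for illustration; the commented-out `torch.onnx.export` call is where the export would happen.

```python
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    """Stand-in for a model (e.g. MobileNetV2) that contains Dropout."""

    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.drop = nn.Dropout(0.5)

    def forward(self, x):
        return self.drop(self.fc(x))


def strip_dropout(module):
    # Recursively replace every Dropout submodule with Identity so the
    # exported graph never emits a Dropout op for Caffe2 to choke on.
    for name, child in module.named_children():
        if isinstance(child, nn.Dropout):
            setattr(module, name, nn.Identity())
        else:
            strip_dropout(child)
    return module


model = strip_dropout(TinyNet()).eval()
# torch.onnx.export(model, torch.randn(1, 4), "model.onnx")
```

Note that simply calling `.eval()` makes Dropout a no-op at inference time, but depending on the exporter version the op may still appear in the graph; physically replacing the module guarantees it is gone.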

This issue has been mentioned previously in the old Caffe2 GitHub repo: https://github.com/facebookarchive/caffe2/issues/2165.