Is NNAPI conversion an all-or-nothing in terms of operators?

Hi everybody, first post.
I have a question regarding NNAPI conversion: do all operators need to be supported by NNAPI for the conversion to be successful? I am coming from the Tensorflow Lite world, where NNAPI unsupported operators get pushed back to CPU (with obvious performance implications, but at least the model can be run).
Since `torch.backends._nnapi.prepare.convert_model_to_nnapi` doesn't take any additional arguments, it seems to be all-or-nothing. For example, I was trying to convert MobileNet V3 to NNAPI, but got an error that `quantized::hardswish` is not supported.

Yes @carrythetorch, all operators need to be supported at this point. They aren't that hard to implement, and we're definitely open to PRs: pytorch/ at master · pytorch/pytorch · GitHub
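For anyone comparing the two behaviors, here is a purely illustrative Python sketch (not the actual converter logic; the op names and the `SUPPORTED` set are made up) of the difference between all-or-nothing conversion and TFLite-style CPU fallback, where unsupported ops split the model into alternating NNAPI/CPU segments:

```python
# Toy model represented as a flat list of op names (illustration only).
SUPPORTED = {"conv2d", "relu", "add"}  # hypothetical NNAPI-supported ops

def convert_all_or_nothing(ops):
    """PyTorch-NNAPI-style: fail if any op is unsupported."""
    unsupported = [op for op in ops if op not in SUPPORTED]
    if unsupported:
        raise RuntimeError(f"Unsupported ops: {unsupported}")
    return [("nnapi", list(ops))]

def convert_with_fallback(ops):
    """TFLite-style: partition the graph into NNAPI and CPU segments."""
    segments = []
    for op in ops:
        target = "nnapi" if op in SUPPORTED else "cpu"
        if segments and segments[-1][0] == target:
            segments[-1][1].append(op)  # extend the current segment
        else:
            segments.append((target, [op]))  # start a new segment
    return segments

print(convert_with_fallback(["conv2d", "hardswish", "relu"]))
# -> [('nnapi', ['conv2d']), ('cpu', ['hardswish']), ('nnapi', ['relu'])]
```

With the fallback strategy the unsupported `hardswish` just lands in a CPU segment; with the all-or-nothing strategy the same op list raises, which matches the `quantized::hardswish` error above.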