TensorFlow and PyTorch inference speed problem

When I use MobileNetV2 (https://github.com/tonylins/pytorch-mobilenet-v2) as the model backbone and feed it a single image, PyTorch (0.4.1) inference takes about 150 ms, while TensorFlow inference takes about 30 ms (https://github.com/Zehaos/MobileNet; there is not much speed difference between MobileNetV1 and MobileNetV2 there). So my question is: what causes this difference in speed?
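For reference, this is roughly how I time the PyTorch side. It is only a minimal sketch: the `MobileNetV2` import path is just a placeholder for the model from the repo above, and the input size/iteration counts are assumptions. I run a few warm-up passes and synchronize CUDA before reading the clock, since timing without that can exaggerate PyTorch's numbers.

```python
import time
import torch

# Placeholder import: the MobileNetV2 class from
# https://github.com/tonylins/pytorch-mobilenet-v2 (path is illustrative)
from MobileNetV2 import MobileNetV2

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = MobileNetV2().to(device)
model.eval()  # disable dropout / use batch-norm running stats

x = torch.randn(1, 3, 224, 224, device=device)  # assumed 224x224 RGB input

with torch.no_grad():
    # Warm-up so CUDA context setup and cuDNN autotuning are not timed
    for _ in range(10):
        model(x)
    if device.type == "cuda":
        torch.cuda.synchronize()

    start = time.time()
    for _ in range(100):
        model(x)
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for all GPU work before stopping the clock
    elapsed = (time.time() - start) / 100

print(f"average inference time: {elapsed * 1000:.1f} ms")
```

Is the 150 ms figure measured the same way on both frameworks (same device, batch size 1, excluding the first call)?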
