JIT runs 3x slower than PyTorch eager mode

import time
import torch

# `model` and `inputs` are assumed to be defined already (CUDA model in eval mode, CUDA inputs)

# Eager-mode timing: synchronize before starting the clock and again before stopping it,
# so queued CUDA kernels are fully accounted for in each measurement.
for j in range(100):
    torch.cuda.synchronize()
    start = time.time()

    _ = model(inputs)
    torch.cuda.synchronize()

    end = time.time()
    print('torch infer time: ', (end - start))

# torch.jit.script compiles from the Python source; it does not take example
# inputs as a positional argument (that slot is the deprecated `optimize` flag)
script_module = torch.jit.script(model.cuda().eval())
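
If the goal of the second argument was to compile against concrete example inputs, torch.jit.trace is the variant that accepts them; a minimal sketch, reusing the same model and inputs:

# Tracing alternative (sketch): records the ops executed for these example inputs
traced_module = torch.jit.trace(model.cuda().eval(), inputs)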

# Same timing loop for the scripted module
for j in range(100):
    torch.cuda.synchronize()
    start = time.time()

    _ = script_module(inputs)
    torch.cuda.synchronize()

    end = time.time()
    print('torch jit infer time: ', (end - start))
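
For reference, here is how both versions could be timed a bit more robustly with warm-up iterations (the first TorchScript calls run profiling/optimization passes) and CUDA events. This is only a sketch; `benchmark` is just a helper name, and it assumes the `model`, `inputs`, and `script_module` from above on a single CUDA device:

def benchmark(fn, inputs, warmup=10, iters=100):
    # Warm-up runs so JIT profiling/optimization is not counted in the timing
    for _ in range(warmup):
        _ = fn(inputs)
    torch.cuda.synchronize()
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    for _ in range(iters):
        _ = fn(inputs)
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters  # average milliseconds per call

print('eager avg ms: ', benchmark(model, inputs))
print('jit avg ms:   ', benchmark(script_module, inputs))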

The PyTorch eager forward pass takes about 100 ms, but the JIT forward pass takes about 330 ms. Any suggestions?

torch infer time: 0.10617709159851074
torch infer time: 0.10543274879455566
torch infer time: 0.10464048385620117
torch infer time: 0.10502433776855469
torch infer time: 0.10710620880126953
torch infer time: 0.10780525207519531
torch infer time: 0.11376476287841797
torch infer time: 0.10918307304382324
torch infer time: 0.10780143737792969
torch infer time: 0.10900402069091797
torch infer time: 0.10415053367614746
torch infer time: 0.10154843330383301
torch infer time: 0.10197162628173828
torch infer time: 0.10551595687866211
torch infer time: 0.10509729385375977
torch infer time: 0.10562729835510254
torch infer time: 0.1086127758026123
torch jit infer time: 0.5637478828430176
torch jit infer time: 0.3401811122894287
torch jit infer time: 0.3404266834259033
torch jit infer time: 0.322681188583374
torch jit infer time: 0.325390100479126
torch jit infer time: 0.3332507610321045
torch jit infer time: 0.33599114418029785
torch jit infer time: 0.33664798736572266
torch jit infer time: 0.3400590419769287
torch jit infer time: 0.33101773262023926
torch jit infer time: 0.3290140628814697
torch jit infer time: 0.3344461917877197
torch jit infer time: 0.3417356014251709
torch jit infer time: 0.33629822731018066
torch jit infer time: 0.3459322452545166

The model contains many tensor-creation ops; could that be the reason?
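
By "tensor-creation ops" I mean intermediate tensors allocated inside forward() on every call, roughly like this (a toy sketch assuming a 2-D input, not the actual model):

import torch
import torch.nn as nn

class ToyModel(nn.Module):
    def forward(self, x):
        # Tensors created on every forward call
        mask = torch.zeros(x.shape[0], x.shape[1], device=x.device)
        idx = torch.arange(x.shape[1], device=x.device, dtype=x.dtype)
        return x + mask + idx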