Tracing a model with control flow in forward operation

I am tracing a model to export it to C++ via the command:

torch.jit.trace(model.eval(), example_inp).

Does the tracing done as above still work if I include in the forward pass of the model something like the following code snippet?

    def forward(self, x):
        if self.eval:
            return x + 5
        else:
            return x

Or do I have to go through scripting/annotation to export the model, even with this minimal control flow?

torch.jit.trace will not record any control flow, as described in the docs.
Additionally, the condition in your posted code is wrong, since you are checking a bound method rather than calling it:

    model.eval
    # <bound method Module.eval of MyModel()>

A bound method is always truthy, so the condition will always evaluate to True.
Use self.training instead as this attribute will be changed according to model.train()/.eval().
After this fix, the traced model will stick to the recorded code path.
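To make the distinction concrete, here is a minimal sketch (using a stock nn.Linear as a stand-in model, not code from this thread) showing that the bound method model.eval is always truthy, while self.training is the flag that .train()/.eval() actually toggle:

```python
import torch.nn as nn

model = nn.Linear(2, 2)

# A bound method object is always truthy, so `if self.eval:`
# would always take the True branch
print(bool(model.eval))
# True

# `training` is the boolean attribute that .train()/.eval() flip
model.eval()
print(model.training)
# False

model.train()
print(model.training)
# True
```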

Thanks a lot for the correction. I am not sure I got the last part of your response, though. Using the following code snippet, is the recorded code path going to be the one in the else branch of the forward?

    def forward(self, x):
        if self.training:
            return x
        else:
            return x + 5

torch.jit.trace(model.eval(), example_inp)

The traced model would take the else path even if you want to switch back to training mode:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        if self.training:
            return x
        else:
            return x + 5

x = torch.ones(10)
model = MyModel()

out = model(x)
print(out)
# tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])

model.eval()
out = model(x)
print(out)
# tensor([6., 6., 6., 6., 6., 6., 6., 6., 6., 6.])

# reset
model.train()
out = model(x)
print(out)
# tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])

model_traced = torch.jit.trace(model.eval(), x)

print(model_traced(x))
# tensor([6., 6., 6., 6., 6., 6., 6., 6., 6., 6.])

# does not change the behavior
model_traced.train()
print(model_traced(x))
# tensor([6., 6., 6., 6., 6., 6., 6., 6., 6., 6.])
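If you need the branch to survive export, torch.jit.script compiles the control flow instead of baking in one path, so .train()/.eval() keep working on the exported module. A minimal sketch using the same toy module as above:

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def forward(self, x):
        if self.training:
            return x
        else:
            return x + 5

x = torch.ones(10)

# scripting preserves the if/else on self.training
model_scripted = torch.jit.script(MyModel().eval())
print(model_scripted(x))
# tensor([6., 6., 6., 6., 6., 6., 6., 6., 6., 6.])

# switching back to training mode now changes the behavior
model_scripted.train()
print(model_scripted(x))
# tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])
```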

Ok, now I understand, thank you a lot!