Confused about Torch.Jit.Script error message

Hello, good morning. I am relatively new to PyTorch but liking it a lot! I have a BERT classification model that I can save, load, and run inference on locally; the model was trained on an NVIDIA CUDA GPU. The Torch version is 1.10.0+cu102 on Windows 10 64-bit. Up to this point the model loads fine:
model.load_state_dict(torch.load('MyModelPath/MyModel15.model', map_location=torch.device('cpu')))

Here is where the problem starts for me:

scripted_model = torch.jit.script(model)

throws this error message

Traceback (most recent call last):
  File "C:\Users\javedh\lib\site-packages\IPython\core\", line 3457, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "", line 1, in <module>
    scripted_model = torch.jit.script(model)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\", line 1257, in script
    return torch.jit._recursive.create_script_module(
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\", line 451, in create_script_module
    return create_script_module_impl(nn_module, concrete_type, stubs_fn)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\", line 464, in create_script_module_impl
    property_stubs = get_property_stubs(nn_module)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\", line 778, in get_property_stubs
    properties_asts = get_class_properties(module_ty, self_name="RecursiveScriptModule")
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 161, in get_class_properties
    getter = get_jit_def(prop[1].fget, f"__{prop[0]}_getter", self_name=self_name)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 264, in get_jit_def
    return build_def(parsed_def.ctx, fn_def, type_line, def_name, self_name=self_name, pdt_arg_types=pdt_arg_types)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 315, in build_def
    build_stmts(ctx, body))
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 137, in build_stmts
    stmts = [build_stmt(ctx, s) for s in stmts]
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 137, in <listcomp>
    stmts = [build_stmt(ctx, s) for s in stmts]
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 287, in __call__
    return method(ctx, node)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 550, in build_Return
    return Return(r, None if stmt.value is None else build_expr(ctx, stmt.value))
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 287, in __call__
    return method(ctx, node)
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 702, in build_Call
    args = [build_expr(ctx, py_arg) for py_arg in expr.args]
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 702, in <listcomp>
    args = [build_expr(ctx, py_arg) for py_arg in expr.args]
  File "C:\Users\javedh\AppData\Roaming\Python\Python39\site-packages\torch\jit\", line 286, in __call__
    raise UnsupportedNodeError(ctx, node)
torch.jit.frontend.UnsupportedNodeError: GeneratorExp aren't supported:
  File "C:\Users\javedh\lib\site-packages\transformers\", line 987
    return any(hasattr(m, "gradient_checkpointing") and m.gradient_checkpointing for m in self.modules())

I am not sure how to make sense of this in a meaningful way; it seems like a package support issue, but I could be wrong. I would appreciate any assistance in understanding it.

Thank you very much!

The error points to:

torch.jit.frontend.UnsupportedNodeError: GeneratorExp aren't supported

which is raised by the generator expression used in the code, as seen in this minimal code snippet:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        a = any(i for i in range(10))
        return x

model = MyModel()
x = torch.randn(1, 1)
out = model(x)

scripted = torch.jit.script(model)
# > UnsupportedNodeError: GeneratorExp aren't supported
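To see why the frontend complains, note that TorchScript compiles your model by walking the Python AST, and `any(i for i in range(10))` parses to an `ast.GeneratorExp` node, which the frontend has no handler for. A small standalone sketch using only Python's `ast` module (no torch needed) shows the difference between the rejected generator expression and the equivalent list comprehension:

```python
import ast

# A generator expression parses to an ast.GeneratorExp node;
# this is the node type the TorchScript frontend rejects.
gen = ast.parse("any(i > 0 for i in range(10))", mode="eval")
print(type(gen.body.args[0]).__name__)  # -> GeneratorExp

# The equivalent list comprehension parses to ast.ListComp instead,
# which is a different node type.
comp = ast.parse("any([i > 0 for i in range(10)])", mode="eval")
print(type(comp.body.args[0]).__name__)  # -> ListComp
```

Note that in the traceback above the offending line is inside the installed transformers package (the `gradient_checkpointing` check), not in the user's own model code.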

Thank you so much ptrblck, any recommendations on how I can address this issue?

thanks again
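For anyone hitting the same error: since the generator expression here lives inside the installed transformers package rather than in your own code, typical options are upgrading transformers to a release that avoids it or tracing the model with torch.jit.trace instead of scripting it. When the generator expression is in your own module, rewriting it as an explicit loop (or a list comprehension) makes the module scriptable. A minimal sketch of that rewrite, assuming the toy `MyModel` from the answer above (the names are illustrative, not from the original thread):

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def forward(self, x):
        # Explicit loop instead of `any(i for i in range(10))`;
        # the TorchScript frontend compiles loops, but not
        # generator expressions.
        found = False
        for i in range(10):
            if i > 0:
                found = True
        return x

model = MyModel()
scripted = torch.jit.script(model)  # no UnsupportedNodeError now
out = scripted(torch.randn(1, 1))
```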