torch.jit.trace works, but torch.jit.script does not

Is it expected that for some modules torch.jit.trace works well while torch.jit.script fails? In particular, I get an error related to TorchScript treating all objects as Tensors by default:

Tensor cannot be used as a tuple

Yes, this is expected.

torch.jit.trace constructs a JIT IR graph by observing the operations that are performed on Tensors by the Module being traced, with little concern for the Python language constructs used to express those operations. This means it can produce JIT graphs for a wide variety of programs, but those graphs may not be faithful representations of the corresponding programs. For example, tracing is unable to capture control flow because it observes tensor operations at runtime, so the graph it constructs will contain the one branch of a given if statement that was actually taken during tracing, but not the other.
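A minimal sketch of this limitation (the module name and values are illustrative, not from the original question): the traced graph bakes in whichever branch the example input took, so the trace silently gives wrong answers for inputs that should take the other branch.

```python
import torch


class MyModule(torch.nn.Module):
    def forward(self, x):
        # Data-dependent control flow: tracing records only the
        # branch taken for the example input, not the `if` itself.
        if x.sum() > 0:
            return x * 2
        return -x


m = MyModule()
# Traced with an input that takes the `x.sum() > 0` branch.
traced = torch.jit.trace(m, torch.ones(3))

# The traced graph always multiplies by 2, even for inputs that
# should have taken the other branch in eager mode:
print(traced(-torch.ones(3)))  # x * 2 path, not -x
print(m(-torch.ones(3)))       # eager mode takes the -x path
```

PyTorch also emits a TracerWarning for the tensor-to-bool conversion here, which is a useful signal that a trace may not generalize.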

torch.jit.script produces a JIT graph from a program by compiling it. However, it places a number of restrictions on the programs it can compile, such as requiring that they be statically typed (through heavy use of type annotations) and use only a specified subset of Python language features.
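As a sketch of what scripting looks like in practice (the function and values here are illustrative), type annotations tell the compiler that an argument is not a Tensor, and data-dependent control flow is compiled faithfully rather than baked in:

```python
import torch


@torch.jit.script
def threshold(x: torch.Tensor, limit: float) -> torch.Tensor:
    # Without the `float` annotation, TorchScript would assume
    # `limit` is a Tensor. The `if` below is compiled into the
    # graph, so both branches are preserved.
    if x.sum() > limit:
        return x
    return torch.zeros_like(x)


print(threshold(torch.ones(2), 0.5))  # sum 2.0 > 0.5: returns x
print(threshold(torch.ones(2), 5.0))  # sum 2.0 <= 5.0: returns zeros
```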

To answer your question: torch.jit.script assumes any argument without a type annotation is a Tensor. To script your Module, you will need to add type annotations to its methods and attributes and make sure you are not using any unsupported Python language features.
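For instance, this is a minimal sketch (the module name is made up, not from your code) of a forward method that triggers exactly the "Tensor cannot be used as a tuple" error when the annotation is missing, and compiles once the argument is annotated as a tuple:

```python
from typing import Tuple

import torch


class Unpack(torch.nn.Module):
    # Without the annotation, TorchScript assumes `pair` is a Tensor,
    # and the unpacking `a, b = pair` fails with
    # "Tensor cannot be used as a tuple".
    def forward(self, pair: Tuple[torch.Tensor, torch.Tensor]) -> torch.Tensor:
        a, b = pair
        return a + b


scripted = torch.jit.script(Unpack())
print(scripted((torch.ones(2), torch.ones(2))))  # elementwise sum
```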