As the error suggests, tuple sizes are static (hardcoded) at compile time, so you have to do something like
return (result[0], result[1])
and in some contexts you may need to annotate this as Tuple[Tensor, Tensor].
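A minimal sketch of that fixed-size return (the name first_two_rows is made up for illustration):

```python
import torch
from torch import Tensor
from typing import Tuple

def first_two_rows(result: Tensor) -> Tuple[Tensor, Tensor]:
    # indexing fixed positions gives the compiler a statically-sized tuple
    return (result[0], result[1])

scripted = torch.jit.script(first_two_rows)
a, b = scripted(torch.rand(3, 4))
```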
If you're comparing to C++, List is like a std::vector<Tensor> of dynamic size, so you cannot initialize specialized N-tuples from it without runtime checks and metaprogramming anyway.
Though I agree that TorchScript is lacking in some aspects, in this case the workaround is to return a list.
import torch
from torch import Tensor

def some_rotation_op(rotation: Tensor):
    [Rxz, Ryz, Rzz] = rotation[:, -1]  # <------ whether this is a list makes no difference
    ...
    return Rxz

torch.jit.script(some_rotation_op)  # scripting compiles from source, no example inputs needed
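For completeness, the list-based workaround mentioned above can be sketched like this (split_rows is a hypothetical name); a List[Tensor] return compiles regardless of how many elements it holds:

```python
import torch
from torch import Tensor
from typing import List

def split_rows(result: Tensor) -> List[Tensor]:
    # the list's length is a runtime property, so TorchScript accepts any size
    return [result[i] for i in range(result.size(0))]

scripted = torch.jit.script(split_rows)
parts = scripted(torch.rand(3, 4))
```

The trade-off is that callers index the list at runtime and lose the static arity that a tuple return would carry.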