TorchScript: Using tensor populated with a different module

Hello,

What’s the best way to write TorchScript code which does this:

class S(torch.jit.ScriptModule):
    def __init__(self):
        self.tensor_constant = torch.ones(2)

    @torch.jit.script_method
    def forward(self):
        return self.tensor_constant + 2

S()

It fails with
attribute 'tensor_constant' of type 'Tensor' is not usable in a script method (Tensors must be added to a module as a buffer or parameter):

In other words, in TorchScript how can I use a tensor populated using a different module?

Thanks,
Omkar

There are two things wrong with the code:

  1. Your class S subclasses torch.jit.ScriptModule, but its __init__ never calls the superclass constructor. Before doing anything else in __init__, you have to run the __init__() of the class you are subclassing (here, torch.jit.ScriptModule) by calling super().__init__().
  2. Tensor attributes used inside script methods must be registered as buffers (or parameters) with register_buffer, so that the JIT compiler knows about them when it compiles the module.
class S(torch.jit.ScriptModule):
    def __init__(self):
        super().__init__()
        self.register_buffer('tensor_constant', torch.ones(2, dtype=torch.float))

    @torch.jit.script_method
    def forward(self):
        return self.tensor_constant + 2

S()
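As a side note, in more recent PyTorch versions the same module can be written as a plain torch.nn.Module and compiled with torch.jit.script, which avoids subclassing ScriptModule directly. A minimal sketch (the class name S2 is just for illustration):

```python
import torch

class S2(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Registering the tensor as a buffer makes it visible to the compiler
        # and moves it along with the module on .to()/.cuda() calls.
        self.register_buffer('tensor_constant', torch.ones(2))

    def forward(self):
        return self.tensor_constant + 2

scripted = torch.jit.script(S2())
print(scripted())  # tensor([3., 3.])
```

Because the buffer is part of the module's state, it is also saved and restored with the module's state_dict.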

Ask if you need any clarification.


Thanks Kushaj. I think I can use register_buffer in my use case.
