TorchScript: using ModuleList and checking module type results in an error

Hi All,

I'm trying to use TorchScript to convert my model and am running into an issue; I'd like to know what the best way forward is.

Say you have an nn.Module as follows:

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_sequence

class Model(nn.Module):
    def __init__(self, hidden_size, input_size, num_layers, dropout):
        super(Model, self).__init__()
        self.hidden_size = hidden_size
        self.input_size = input_size
        self.num_layers = num_layers
        self.dropout = dropout

        modules = []
        for layer in range(self.num_layers):
            # First layer takes the raw input; later layers take the
            # bidirectional LSTM output (hidden_size * 2).
            lstm = nn.LSTM(self.input_size if layer == 0 else self.hidden_size * 2,
                           self.hidden_size, bidirectional=True, batch_first=True)
            modules.append(lstm)
            if layer != 0:
                # Maps hidden_size * 2 -> hidden_size * 2 so the next LSTM's
                # input size stays consistent.
                linear = nn.Linear(self.hidden_size * 2, self.hidden_size * 2)
                modules.append(linear)
            if layer != self.num_layers - 1:
                dropout = nn.Dropout(p=self.dropout)
                modules.append(dropout)
        self.modulelist = nn.ModuleList(modules)

    def forward(self, X):
        for module in self.modulelist:
            if ".LSTM" in str(type(module)):
                pack_sequence(...)
                # Do something ...
                pack_sequence(...)
            elif ".Linear" in str(type(module)):
                pass  # Do something else ...
            elif ".Dropout" in str(type(module)):
                pass  # Do something else ...
        return X

This results in a ModuleList with the layer structure:

  LSTM -> Dropout -> LSTM -> Linear -> Dropout -> LSTM -> Linear -> Dropout -> LSTM -> Linear

I get an error when I call torch.jit.script on the model: compilation fails at the str(type(module)) check while iterating through the ModuleList in the forward pass.
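For reference, this is the exact call (the sizes are just the ones from my setup):

model = Model(hidden_size=256, input_size=512, num_layers=4, dropout=0.3)
scripted_module = torch.jit.script(model)  # compilation of forward() fails here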

Is there a way to circumvent this issue, i.e. is this kind of forward pass supported by TorchScript? The only alternative I've come up with so far is sketched below, but I'm not sure it's the idiomatic fix. Thanks!
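The rough idea (untested, and it drops the pack_sequence handling for brevity): give every entry in the ModuleList the same forward signature by wrapping each layer's pieces in a small submodule, so the loop no longer needs a type check. The Layer class name and the Linear dimensions are just my guesses; nn.Identity serves as a no-op stand-in where a layer has no Linear or Dropout.

class Layer(nn.Module):
    def __init__(self, in_size, hidden_size, p, has_linear, has_dropout):
        super(Layer, self).__init__()
        self.lstm = nn.LSTM(in_size, hidden_size, bidirectional=True, batch_first=True)
        # nn.Identity() keeps the attribute a Module, so TorchScript never
        # has to reason about Optional submodules.
        self.linear = nn.Linear(hidden_size * 2, hidden_size * 2) if has_linear else nn.Identity()
        self.drop = nn.Dropout(p=p) if has_dropout else nn.Identity()

    def forward(self, X):
        X, _ = self.lstm(X)   # LSTM returns (output, (h_n, c_n)); keep the output
        X = self.linear(X)
        X = self.drop(X)
        return X

Model.forward would then become a uniform loop, which TorchScript can unroll over the ModuleList without any isinstance/str(type(...)) checks:

    def forward(self, X):
        for layer in self.modulelist:
            X = layer(X)
        return X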