How to use legacy layers in PyTorch?

Hello,

I’m translating from Torch to PyTorch. There are some legacy layers I don’t know what to do with:

        in_dim = in_dim + 1 -- for NIL padding
        g_opts.encoder_lut_nil = in_dim
        local m = nn.LookupTable(in_dim, hidsz)
        g_modules['encoder_lut'] = m
        local s = nn.Sequential()
        s:add(m)
        s:add(nn.Sum(2))
        s:add(nn.Add(hidsz)) -- bias
        g_modules['encoder_sum'] = s.modules[3]
        return s

I have tried:


            in_dim = in_dim + 1 # for NIL padding
            self.opts['encoder_lut_nil'] = in_dim
            m = nn.Embedding(in_dim, hidsz) # in_dim agents, returns (batchsz, x, hidsz)
            # g_modules['encoder_lut'] = m
            s = nn.Sequential(
                m,
                Sum(2),
                Add(hidsz) #bias
            )

But Sum and Add from torch.legacy.nn are not subclasses of nn.Module, so nn.Sequential rejects them:

TypeError: torch.legacy.nn.Sum.Sum is not a Module subclass
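Since nn.Sequential only accepts nn.Module instances, one workaround (a minimal sketch, not the only approach) is to wrap the two missing operations as small Module subclasses; the sizes below are made up for illustration:

```python
import torch
import torch.nn as nn

class Sum(nn.Module):
    """Sum over a fixed dimension (0-indexed, unlike Lua's nn.Sum)."""
    def __init__(self, dim):
        super(Sum, self).__init__()
        self.dim = dim

    def forward(self, x):
        return torch.sum(x, self.dim)

class Add(nn.Module):
    """A learnable elementwise bias, standing in for legacy nn.Add(size)."""
    def __init__(self, size):
        super(Add, self).__init__()
        self.bias = nn.Parameter(torch.zeros(size))

    def forward(self, x):
        return x + self.bias

# hypothetical sizes for illustration: 11 symbols, hidden size 8
s = nn.Sequential(nn.Embedding(11, 8), Sum(1), Add(8))
```

Note that Lua's nn.Sum(2) becomes Sum(1) here, because PyTorch dimensions are 0-indexed.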

Does the following work?

class Encoder(nn.Module):
    def __init__(self, in_dim, hidsz):
        super(Encoder, self).__init__()
        self.lut = nn.Embedding(in_dim, hidsz) # in_dim agents, returns (batchsz, x, hidsz)
        self.bias = nn.Parameter(torch.randn(hidsz))

    def forward(self, input):
        x = self.lut(input)
        x = torch.sum(x, 1) # original is nn.Sum(2), but Lua is 1-indexed, so dim 2 there is dim 1 here
        x = x + self.bias # replaces the legacy nn.Add(hidsz) bias layer
        return x
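As a quick sanity check, here is a self-contained version of the encoder with the 0-indexed sum dimension, run on made-up sizes (in_dim=10, hidsz=8 are assumptions, not from the original model):

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, in_dim, hidsz):
        super(Encoder, self).__init__()
        self.lut = nn.Embedding(in_dim, hidsz)   # returns (batchsz, x, hidsz)
        self.bias = nn.Parameter(torch.zeros(hidsz))

    def forward(self, input):
        x = self.lut(input)    # (batchsz, x, hidsz)
        x = torch.sum(x, 1)    # Lua's nn.Sum(2) -> dim 1 in 0-indexed PyTorch
        return x + self.bias   # legacy nn.Add(hidsz) is just a learnable bias

# hypothetical sizes for illustration
enc = Encoder(10, 8)
tokens = torch.randint(0, 10, (4, 6))  # (batchsz=4, x=6) indices
out = enc(tokens)
print(out.shape)  # torch.Size([4, 8])
```

If the output should keep a singleton token dimension (as old nn.Sum with keepdim-style behavior would not), pass keepdim=True to torch.sum.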