Hello,
I'm translating a model from (Lua) Torch to PyTorch. There are some legacy layers I'm not sure how to port:

```lua
in_dim = in_dim + 1 -- for NIL padding
g_opts.encoder_lut_nil = in_dim
local m = nn.LookupTable(in_dim, hidsz)
g_modules['encoder_lut'] = m
local s = nn.Sequential()
s:add(m)
s:add(nn.Sum(2))
s:add(nn.Add(hidsz)) -- bias
g_modules['encoder_sum'] = s.modules[3]
return s
```
I have tried:

```python
in_dim = in_dim + 1  # for NIL padding
self.opts['encoder_lut_nil'] = in_dim
m = nn.Embedding(in_dim, hidsz)  # in_dim agents, returns (batchsz, x, hidsz)
# g_modules['encoder_lut'] = m
s = nn.Sequential(
    m,
    Sum(2),
    Add(hidsz),  # bias
)
```
But `Sum` and `Add` from `torch.legacy.nn` are not subclasses of `nn.Module`, so `nn.Sequential` rejects them:

```
TypeError: torch.legacy.nn.Sum.Sum is not a Module subclass
```
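One workaround I'm considering (a minimal sketch, not from the original code; the sizes `in_dim = 10` and `hidsz = 32` are made up for illustration) is to write small `nn.Module` wrappers that reproduce the legacy behaviour: `Sum` as a `torch.sum` over a fixed dimension, and `Add` as a learnable bias stored in an `nn.Parameter`. Note that Lua Torch dimensions are 1-indexed, so legacy `nn.Sum(2)` on a `(batchsz, x, hidsz)` tensor corresponds to `dim=1` in 0-indexed PyTorch:

```python
import torch
import torch.nn as nn

class Sum(nn.Module):
    """Sums the input over a fixed dimension (replicates legacy nn.Sum)."""
    def __init__(self, dim):
        super().__init__()
        self.dim = dim

    def forward(self, x):
        return x.sum(dim=self.dim)

class Add(nn.Module):
    """Adds a learnable per-feature bias (replicates legacy nn.Add)."""
    def __init__(self, size):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(size))

    def forward(self, x):
        return x + self.bias

in_dim, hidsz = 10, 32       # illustrative sizes, not from the original
in_dim = in_dim + 1          # for NIL padding
m = nn.Embedding(in_dim, hidsz)
s = nn.Sequential(
    m,             # (batchsz, x) -> (batchsz, x, hidsz)
    Sum(1),        # Lua nn.Sum(2) == dim=1 here -> (batchsz, hidsz)
    Add(hidsz),    # bias
)

out = s(torch.randint(0, in_dim, (4, 5)))
print(out.shape)  # torch.Size([4, 32])
```

The wrappers are proper `nn.Module` subclasses, so `nn.Sequential` accepts them and `Add.bias` is registered as a trainable parameter.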