Nn modules to nngraph modules


In Lua, with nngraph we turn an nn module into a graph node by adding an extra pair of parentheses after it, as follows:

input = nn.Identity()()

The input variable then shows up as an nngraph.Node.

How can I get the same thing in PyTorch?

In Lua, when we add the next module or layer, it takes the previous module or layer as its input, like:

newBlock = nn.Linear(512, 256)(input)

Please let me know if there is a similar library for PyTorch.


While the PyTorch code might run successfully, the parameters of the layer (weight and bias) will be randomly initialized on each call.
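To make the pitfall concrete, here is a minimal sketch (layer sizes are arbitrary, chosen for illustration): constructing `nn.Linear` inline, nngraph-style, builds a brand-new layer with fresh random weights on every call, whereas the idiomatic PyTorch pattern creates the layers once in `__init__` and wires them up in `forward`.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.ones(1, 4)

# nngraph-style inline construction: each call creates a NEW Linear
# layer with freshly initialized (random) weights and bias.
out1 = nn.Linear(4, 2)(x)
out2 = nn.Linear(4, 2)(x)
print(torch.allclose(out1, out2))  # False: two different random layers

# Idiomatic PyTorch: define the layer once, reuse its parameters.
class Block(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)  # parameters created exactly once

    def forward(self, inp):
        return self.linear(inp)

block = Block()
print(torch.allclose(block(x), block(x)))  # True: same parameters reused
```

Defining the graph in `forward` is also what replaces nngraph's explicit node wiring: PyTorch builds the computation graph dynamically as `forward` runs.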
Most likely you would want to use this coding style only for modules without any parameters or buffers, e.g. nn.ReLU(). However, I would then recommend just using the functional API: output = F.relu(input).
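A short sketch of that last point (input values are arbitrary, for illustration): since nn.ReLU holds no parameters, creating it inline is harmless, but the functional form expresses the same computation more directly.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([[-1.0, 0.5, 2.0]])

# Inline module construction works here because ReLU is stateless...
out_module = nn.ReLU()(x)

# ...but the functional API says the same thing without building a module.
out_functional = F.relu(x)

print(torch.equal(out_module, out_functional))  # True
```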