forward() takes 2 positional arguments but 3 were given

import torch
import torch.nn as nn

class structural_attention_layer(nn.Module):
    def __init__(self):
        super(structural_attention_layer, self).__init__()

    def forward(self, hello, world):
        # node_num and emb_size are assumed to be defined elsewhere
        return torch.randn(node_num, emb_size)

s_att = nn.Sequential()
s_att.add_module(name="structural_attention_layer", module=structural_attention_layer())
s_att(1, 2)

The error from the title occurs. Why?

You need to provide detailed information, such as the stack trace, when the error occurs.
It looks like a TypeError, so you may be passing the wrong number of positional arguments.

The nn.Sequential container uses a single tensor as the input and output activation.
You could write a custom nn.Module for multiple inputs, or check e.g. this topic for more information and potential workarounds.
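
For example, a minimal sketch of such a custom module (the sub-layer names and feature sizes are made up for illustration, not taken from your model):

import torch
import torch.nn as nn

class TwoInputModule(nn.Module):
    def __init__(self):
        super().__init__()
        # hypothetical sub-layers, just to show the pattern
        self.lin_a = nn.Linear(4, 8)
        self.lin_b = nn.Linear(4, 8)

    def forward(self, a, b):
        # a custom module's forward can accept as many arguments as you like
        return self.lin_a(a) + self.lin_b(b)

model = TwoInputModule()
out = model(torch.randn(2, 4), torch.randn(2, 4))  # two inputs work here

Since forward is defined by you, it is not restricted to the single-input convention that nn.Sequential enforces.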

Yes, you are right.


A single input works, so if I need to use multiple inputs, can I pack them in a list and unpack them in the next layer?

I am sorry, I didn't state my question clearly. The whole code is shown below:

import torch.nn as nn
import torch

class structural_attention_layer(nn.Module):
    def __init__(self):
        super(structural_attention_layer, self).__init__()

    def forward(self, a, b):
        return torch.randn(a, b)

s_att = nn.Sequential()
s_att.add_module(name="structural_attention_layer", module=structural_attention_layer())
s_att(1, 2)  # raises TypeError: forward() takes 2 positional arguments but 3 were given

Yes, this should be possible as long as you are using custom layers which unpack the input in their forward method. Most of the PyTorch layers defined in the nn namespace would fail, since they expect a tensor rather than a list or tuple.
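
A minimal sketch of that idea (the layer names and the toy computation are made up): each custom layer's forward receives a single tuple and unpacks it, so nn.Sequential still only ever passes one object from layer to layer:

import torch
import torch.nn as nn

class AddOne(nn.Module):
    def forward(self, packed):
        # unpack the single tuple that nn.Sequential passes in
        a, b = packed
        # repack so the next layer receives a single object again
        return a + 1, b + 1

class Multiply(nn.Module):
    def forward(self, packed):
        a, b = packed
        # the last layer can return a plain tensor
        return a * b

model = nn.Sequential(AddOne(), Multiply())
out = model((torch.tensor(1.0), torch.tensor(2.0)))  # a single tuple input works

Inserting a built-in layer such as nn.Linear into this chain would fail, because it expects a tensor, not a tuple.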