forward() takes 2 positional arguments but 3 were given with the predefined Transformer decoder layer

I am training a model based on a Transformer decoder. The decoder layer is taken from the predefined PyTorch `nn.TransformerDecoderLayer`, but when I pass the encoder output and the decoder input to the decoder, I get a 'forward() takes 2 positional arguments but 3 were given' error.

class Decoder(nn.Module):
    def __init__(
        self,
        in_channels: int = 1,
        patch_size: int = 16,
        num_transformer_layers: int = 6,
        embedding_dim: int = 768,
        mlp_size: int = 3072,
        num_heads: int = 12,
        attn_dropout: float = 0.1,
        mlp_dropout: float = 0.1,
        embedding_dropout: float = 0.1,
        out_features: int = 512
    ):
        super().__init__()
        self.DecoderPrenet = Prenet(input_size=embedding_dim, output_size=out_features,
                                    hidden_size=embedding_dim)
        self.embedding = PatchEmbedding(in_channels=in_channels,
                                        patch_size=patch_size,
                                        embedding_dim=embedding_dim)
        self.pe = PositionalEncoding(d_model=768)
        self.embedding_dropout = nn.Dropout(p=embedding_dropout)
        self.mel_linear = nn.Linear(embedding_dim, hp.num_mels * hp.outputs_per_step)
        self.transformer_decoder = nn.Sequential(
            *[nn.TransformerDecoderLayer(d_model=embedding_dim,
                                         nhead=num_heads,
                                         dim_feedforward=mlp_size,
                                         dropout=mlp_dropout, activation='gelu',
                                         batch_first=True, norm_first=True)
              for _ in range(num_transformer_layers)])
        self.PostConvNet = PostConvNet(num_hidden=embedding_dim)
        self.encoder = SpeechtoText()

    def forward(self, x, src):
        batch_size = 16
        memory = self.encoder(src)
        x = self.embedding(x)
        x = self.DecoderPrenet(x)
        x = self.pe(x)
        expand = x.expand(batch_size, -1, -1)
        x = self.embedding_dropout(expand)
        x = self.transformer_decoder(x, memory)  # <-- the error is raised here
        mel_out = self.mel_linear(x)
        y = torch.flatten(mel_out, start_dim=0, end_dim=1)
        postnet_input = y.transpose(0, 1)
        postconvnet = self.PostConvNet(postnet_input)
        out = postnet_input + postconvnet
        out = out.transpose(1, 0)
        m = out.shape[0] // batch_size
        k = out.shape[1]
        out = torch.reshape(out, (batch_size, m, k))
        return out
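
For context, my understanding (which may be wrong) is that `nn.Sequential` only forwards a single input through each layer, so it has no way to pass `memory` into the stacked `TransformerDecoderLayer`s. Below is a minimal, untested sketch of how I think the same stack would normally be driven with `nn.TransformerDecoder`, which accepts both `tgt` and `memory`; the shapes and sizes are just made-up placeholders, not my real data:

import torch
import torch.nn as nn

# Sketch only: stack decoder layers with nn.TransformerDecoder instead of
# nn.Sequential, so that forward can take both the target and the memory.
layer = nn.TransformerDecoderLayer(d_model=768, nhead=12,
                                   dim_feedforward=3072, dropout=0.1,
                                   activation='gelu', batch_first=True,
                                   norm_first=True)
decoder = nn.TransformerDecoder(layer, num_layers=6)

tgt = torch.rand(16, 100, 768)     # assumed (batch, target_len, embedding_dim)
memory = torch.rand(16, 200, 768)  # assumed (batch, source_len, embedding_dim)
out = decoder(tgt, memory)         # forward(tgt, memory, ...) accepts two inputs
print(out.shape)                   # torch.Size([16, 100, 768])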

I tried using torchsummary, but I am still receiving the same error. Can anyone help me with this?

Is this a double post from here from another account?

Hi @ptrblck, yes, sorry, my primary work account was having some problems, so I had to repost the question from another account. Sorry for the inconvenience. Can you kindly delete the question, as I am unable to do it from my end?