My error: `TypeError: forward() takes 2 positional arguments but 3 were given`

import torch.nn as nn
import torch
class structural_attention_layer(nn.Module):
    def __init__(self):  # note the double underscores; a plain `init` is never called
        super(structural_attention_layer, self).__init__()

    def forward(self,a,b):
        return torch.randn(a,b)
    
s_att=nn.Sequential()
s_att.add_module(name="structural_attention_layer",module=structural_attention_layer())
s_att(1,2)
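For context, the error seems to come from `nn.Sequential.forward`, which accepts exactly one positional input and threads it through the contained modules, so calling `s_att(1, 2)` passes one argument too many. A minimal sketch of a workaround (the tuple-unpacking `forward` is my assumption about the intent, not part of the original code) would be:

```python
import torch
import torch.nn as nn

class StructuralAttentionLayer(nn.Module):
    def forward(self, inputs):
        # nn.Sequential hands each module a single positional argument,
        # so bundle the two values into one tuple and unpack it here.
        a, b = inputs
        return torch.randn(a, b)

s_att = nn.Sequential()
s_att.add_module("structural_attention_layer", StructuralAttentionLayer())

out = s_att((1, 2))  # one positional argument, so no TypeError
```

Alternatively, calling the submodule directly (`s_att[0](1, 2)`) bypasses `Sequential`'s single-input `forward` entirely.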

Is this a duplicate of the question here, which already has a solution, or how does this code snippet differ?