Maxout layer in pytorch

I want to use a maxout activation in PyTorch, and I used the torch.max() function to implement it.

import torch
import torch.nn as nn

class Maxout(nn.Module):
    def __init__(self):
        super(Maxout, self).__init__()

    def forward(self, x, y):
        # Tensor.max(other) returns the elementwise maximum of x and y
        return x.max(y)

Is it right?
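For what it's worth, `Tensor.max(other)` does compute the elementwise maximum, so the `forward` above returns `max(x, y)` element by element. A quick check:

```python
import torch

a = torch.tensor([1.0, 4.0])
b = torch.tensor([3.0, 2.0])

# Tensor.max(other) is the elementwise maximum, equivalent to torch.maximum
print(a.max(b))             # tensor([3., 4.])
print(torch.maximum(a, b))  # tensor([3., 4.])
```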

I have the same question. In my case, I used the code below. Can anyone help?

import torch
import torch.nn as nn

class Maxout(nn.Module):
    def __init__(self):
        super(Maxout, self).__init__()

    def forward(self, x, y):
        return torch.max(x, y)

You can define a layer like that, but it isn't necessary: you can call torch.max() directly in your model's forward and it will do the same thing.
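Note that both snippets above only take the max of two tensors. A full maxout layer in the sense of Goodfellow et al. takes the maximum over k affine pieces of a single input. A minimal sketch (the class name `MaxoutLinear` and its parameters are illustrative, not a standard PyTorch API):

```python
import torch
import torch.nn as nn

class MaxoutLinear(nn.Module):
    """Maxout over k affine pieces: out_j = max_i (x @ W_ij + b_ij)."""
    def __init__(self, in_features, out_features, k=2):
        super().__init__()
        self.out_features = out_features
        self.k = k
        # One linear layer producing all k pieces at once
        self.linear = nn.Linear(in_features, out_features * k)

    def forward(self, x):
        out = self.linear(x)                                  # (..., out_features * k)
        out = out.view(*out.shape[:-1], self.out_features, self.k)
        return out.max(dim=-1).values                         # max over the k pieces

layer = MaxoutLinear(8, 4, k=3)
y = layer(torch.randn(2, 8))
print(y.shape)  # torch.Size([2, 4])
```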

Check this: maxout-layer