Counterpart of nn.Max(dim) in PyTorch

In Torch, we can use nn.Max to take the max along a given dimension of the input. How can this be achieved in PyTorch? Can anyone give some tips?

Problem solved. I found it in the legacy folder (torch.legacy.nn) instead of the torch/nn folder in PyTorch.

You can simply use torch.max. You don't need a separate nn.Max layer in PyTorch.
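For example (a quick sketch; when given a dimension, torch.max returns a (values, indices) pair):

    import torch

    x = torch.rand(4, 12, 5)           # toy tensor, sizes chosen just for illustration
    values, indices = torch.max(x, 1)  # reduce over dimension 1
    print(values.size())               # 4 x 5 (very old versions may keep the reduced dim as size 1)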

Thank you very much for your advice. I find that wrapping legacy.nn.Max into torch.nn.Sequential causes the error below:

TypeError: torch.legacy.nn.some_module.some_module is not a Module subclass

So if torch.legacy.nn is not compatible with regular PyTorch modules, what is the point of torch.legacy.nn? And if it is compatible, can you give me some template code on how to use it? Much obliged.

Suppose I want to implement a model that views an input of size (bz * 12) x C x H x W as size bz x 12 x C x H x W, and then max-pools it along the second dimension, resulting in a tensor of size bz x C x H x W. Below is the code I implemented:

import torch.nn as nn
import torch.legacy.nn as legacy_nn
from torch.autograd import Variable

class View_And_Pool(nn.Module):
    def __init__(self):
        super(View_And_Pool, self).__init__()
        # max over dimension 1 (the "12" dimension after the view below)
        self.Pool_Net = legacy_nn.Max(1)
        # only the max layer here; the view happens in forward()
        # (a softmax layer could be added later)

    def forward(self, x):
        # view x of size (bz*12) x C x H x W as bz x 12 x C x H x W
        x = x.view(-1, 12, x.size(1), x.size(2), x.size(3))
        # legacy nn modules are not callable, so call updateOutput directly;
        # note this operates on the raw tensor (x.data), outside autograd
        x = self.Pool_Net.updateOutput(x.data)
        # wrap the raw tensor back into a Variable
        return Variable(x)

The above code works okay for the forward pass, but I am not sure whether the backward pass also works. Any advice?

Sorry, the above code cannot do the backward pass, since updateOutput operates on the raw tensor outside the autograd graph. Maybe I should use torch.max instead.
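For reference, here is a minimal sketch of the same module rewritten with torch.max (the hard-coded group size of 12 comes from the snippet above); since gradients flow through the values returned by torch.max, backward works:

    import torch
    import torch.nn as nn

    class View_And_Pool(nn.Module):
        def forward(self, x):
            # view x of size (bz*12) x C x H x W as bz x 12 x C x H x W
            x = x.view(-1, 12, x.size(1), x.size(2), x.size(3))
            # torch.max returns (values, indices); keep the values
            x, _ = torch.max(x, dim=1)
            # x now has size bz x C x H x W (on very old versions the
            # reduced dimension may be kept as size 1 and need a squeeze)
            return x

    net = View_And_Pool()
    out = net(torch.randn(2 * 12, 3, 8, 8))
    print(out.size())  # 2 x 3 x 8 x 8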