How to enable a dynamic output length in a common MLP network

Dear all, could someone help me with the following issue?
For a common multi-layer neural network, the output length of the final layer differs from batch to batch. How can I implement a multi-layer network like this? Is it possible in PyTorch?
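For concreteness, here is a minimal sketch of the kind of thing I mean (all the sizes and the `out_len` argument are placeholders I made up, not from real code). One idea would be to size the final layer to the largest length I might need and slice per batch, but I'm not sure this is the right pattern:

```python
import torch
import torch.nn as nn

class DynamicMLP(nn.Module):
    """Hypothetical MLP whose final output length varies per batch."""

    def __init__(self, in_dim=128, hidden_dim=64, max_out=32):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
        )
        # Final layer sized to the largest output length that might be
        # needed; the output would then be sliced down per batch.
        self.head = nn.Linear(hidden_dim, max_out)

    def forward(self, x, out_len):
        h = self.trunk(x)
        # out_len differs from batch to batch -- is slicing like this
        # a reasonable approach, or is there a better way?
        return self.head(h)[:, :out_len]

model = DynamicMLP()
batch = torch.randn(8, 128)
print(model(batch, out_len=10).shape)  # torch.Size([8, 10])
```

Is slicing a fixed-size head like this the usual approach, or is there a cleaner way to get a truly dynamic output length?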