What about BatchNorm1d?
And of course, a Linear module is simply an affine transformation, so it expects vectors (1D) or a stack of vectors, i.e. matrices (2D). ReLU() is a point-wise operator, so it works with input of any dimensionality.
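A minimal sketch of the point above: `nn.Linear` only cares that the last dimension matches `in_features`, while `nn.ReLU` is applied element-wise to a tensor of any shape. The specific sizes below are arbitrary, chosen just for illustration:

```python
import torch
import torch.nn as nn

# Linear: an affine map from R^4 to R^3, applied to a stack of vectors.
linear = nn.Linear(4, 3)
batch = torch.randn(2, 4)     # 2 vectors of size 4
out = linear(batch)           # shape (2, 3)

# ReLU: point-wise, so the input can have any number of dimensions.
relu = nn.ReLU()
tensor = torch.randn(2, 5, 7)
activated = relu(tensor)      # same shape, negatives clamped to 0
```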
No, we don’t plan to add any modules for reshaping that could be put into a Sequential. You might want to take a look at how torchvision models are implemented.
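To illustrate the pattern the answer points at, here is a hedged sketch (the network `TinyNet` and its layer sizes are made up for this example): instead of a reshaping module inside a `Sequential`, the flattening happens by hand in `forward()` via `view()`, which is how torchvision models handle the transition from convolutional features to a classifier:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional part lives in a Sequential...
        self.features = nn.Sequential(
            nn.Conv2d(1, 4, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # ...the classifier expects flattened vectors.
        self.classifier = nn.Linear(4 * 8 * 8, 10)

    def forward(self, x):
        x = self.features(x)        # (N, 4, 8, 8)
        x = x.view(x.size(0), -1)   # reshape done here, not via a module
        return self.classifier(x)

net = TinyNet()
out = net(torch.randn(2, 1, 8, 8))  # out has shape (2, 10)
```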