Linear on Sequential returns extra dimensions

As part of an assignment I constructed a simple MLP with two hidden layers and one output layer inside an nn.Sequential().

In one of the tests it receives a 1D vector of shape [4] as input, but instead of returning an output of shape [3], it returns one of shape [4, 3].

What on earth causes this? The same construct returns 1D outputs for 1D inputs in earlier tests.

This is on PyTorch.

self.a_net = nn.Sequential(
    nn.Linear(n_node_features * 2, 96),
    nn.ReLU(),
    nn.Linear(96, 96),
    nn.ReLU(),
    nn.Linear(96, n_edge_features),
)

Hi Markus!

PyTorch models are generally written to work on batches of samples. If you wish to pass a single length-4 sample to a network, you have to package it as a batch of length-4 samples with nBatch = 1.

That is, for this example, your input should have shape [1, 4] (not just [4]). Your network appears to be interpreting the input of shape [4] as a batch of 4 single-scalar samples, and therefore returns a batch of 4 outputs with shape [4, nOutput].
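
Here is a minimal sketch of the idea with placeholder sizes (assuming the first Linear expects 4 input features, i.e. n_node_features * 2 == 4, and n_edge_features == 3):

import torch
import torch.nn as nn

# placeholder sizes: n_node_features * 2 assumed to be 4, n_edge_features assumed to be 3
net = nn.Sequential(
    nn.Linear(4, 96),
    nn.ReLU(),
    nn.Linear(96, 96),
    nn.ReLU(),
    nn.Linear(96, 3),
)

x = torch.randn(4)         # a single length-4 sample, shape [4]
batch = x.unsqueeze(0)     # package it as a batch of one, shape [1, 4]
out = net(batch)           # shape [1, 3]
single = out.squeeze(0)    # drop the batch dimension again, shape [3]

If your test expects a 1D result, the squeeze(0) at the end recovers it.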

I would, however, have expected you to get some sort of dimension-mismatch error (assuming that your network was expecting length-4 samples), rather than having the network accept (a batch of 4) single-scalar samples.
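
For what it's worth, here is a quick check of that expectation (the sizes here are made up): a 1D input whose length does not match in_features raises a runtime error:

import torch
import torch.nn as nn

layer = nn.Linear(8, 3)        # made-up sizes: this layer expects length-8 samples
try:
    layer(torch.randn(4))      # a 1D input of length 4 does not match in_features = 8
except RuntimeError as e:
    print("shape mismatch:", e)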

Best.

K. Frank