BatchNorm1d for evaluation

In my code, I’ve used nn.BatchNorm1d(64) as one of the layers, where 64 is the batch size. Now, after training my net, I want to test it on a single sample, but it shows an error saying that I need to pass 64 samples. Is there any workaround to test the net with a single sample?

The num_features argument you are passing defines the number of channels C in a [N, C, L] input or the number of features C in a [N, C] input.
It does not specify the batch size N.

For evaluation and testing you should call model.eval(), which will apply the running stats to the samples and will thus also work with single samples.
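For example, a minimal sketch (the layer sizes here are made up just for illustration):

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 64),     # 64 output features, unrelated to the batch size
    nn.BatchNorm1d(64),    # num_features must match the feature dim, not N
    nn.ReLU(),
)

model.eval()               # use the running stats instead of batch statistics
with torch.no_grad():
    out = model(torch.randn(1, 10))   # a single sample now works fine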

Hey @ptrblck, I’m using an RNN encoder-decoder model, and at the decoder side I have a few linear layers. I’m using batch norm between these layers. My input to the nn.BatchNorm1d layer is of the shape (seq_len, batch, hidden_size), i.e. [1, 64, 256].
So what should I pass to nn.BatchNorm1d()?
I tried passing 64, but after training the net, when I pass a single sample it shows the error ‘running_mean should contain 1 element not 64’.
Please help me here.

I think you would have to permute the activation from [seq, batch_size, features] to [batch_size, features, seq] via x = x.permute(1, 2, 0) and pass it to the batch norm layer.
Currently you are using the batch dimension to normalize the activations, which won’t work.
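
Something like this, using the [1, 64, 256] shape from your post, where 256 is the hidden/feature size:

import torch
import torch.nn as nn

bn = nn.BatchNorm1d(256)       # 256 = hidden_size, i.e. the feature dimension

x = torch.randn(1, 64, 256)    # [seq_len, batch_size, hidden_size]
x = x.permute(1, 2, 0)         # -> [batch_size, features, seq_len] = [64, 256, 1]
out = bn(x)                    # now normalizes over the 256 features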

Alternatively, you could also set batch_first=True in your RNN, which should return the tensor as [batch_size, seq, features], and then permute only the last two dimensions.
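
E.g., a quick sketch assuming a GRU just for illustration:

import torch
import torch.nn as nn

rnn = nn.GRU(input_size=128, hidden_size=256, batch_first=True)
out, _ = rnn(torch.randn(64, 1, 128))   # out: [batch_size, seq, features] = [64, 1, 256]
out = out.permute(0, 2, 1)              # -> [64, 256, 1] for nn.BatchNorm1d(256)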


Thanks for giving your time. Now I understand it. One more question, though:
My batch norm layer sits between linear layers, and all of them are stacked in nn.Sequential(). Is there a way to permute the input inside nn.Sequential()? If not, I have to separate out the batch norm layer, which makes the code a little messy.

If the batch norm layers sit between linear layers, the activation shape would most likely already have to be changed once before the first linear layer.

Anyway, you could define a Permute layer and use it inside your nn.Sequential container:

import torch.nn as nn

class Permute(nn.Module):
    """Wraps tensor.permute() so it can be used inside an nn.Sequential."""
    def __init__(self, dims):
        super().__init__()
        self.dims = dims

    def forward(self, x):
        # reorder the dimensions and make the result contiguous in memory
        return x.permute(self.dims).contiguous()
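
It could then be used e.g. with the [1, 64, 256] activation from before (the linear layer sizes are just placeholders):

import torch

decoder = nn.Sequential(
    nn.Linear(256, 256),    # nn.Linear works on the last dim, so [1, 64, 256] is fine
    Permute((1, 2, 0)),     # [seq, batch, features] -> [batch, features, seq]
    nn.BatchNorm1d(256),
    Permute((2, 0, 1)),     # back to [seq, batch, features]
    nn.Linear(256, 128),
)

out = decoder(torch.randn(1, 64, 256))   # out: [1, 64, 128]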