`num_features` argument in BatchNorm1d?

Hello,

The signature of BatchNorm1d is torch.nn.BatchNorm1d(num_features, ...).

The docs describe the argument as

num_features – C from an expected input of size (N,C,L) or from input of size (N,L)

and the expected input as

Input: (N,C) or (N,C,L)

My input is a time series sampled at 2048 Hz, and I take 1 s of it, i.e. 2048 samples. I currently feed the network a tensor of size [1, 2048] and use

torch.nn.BatchNorm1d(1)

While the results are bad (probably because my NN is still random/bad at the moment), I'm wondering whether I'm using it correctly.

Of course, I have 2048 features and not 1. But the docs say the input has to be of shape (N, C) and that num_features is C, which would mean that if I put in a tensor of shape [1, 2048] I actually have num_features=2048.

This confuses me, because I set num_features=1. And why do I need to set num_features at all if it can be deduced from the input shape?
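To illustrate my reading of the docs, here is a minimal sketch (the batch size of 4 is made up for the example): if the second dimension of a 2D input is C, then num_features would have to match it.

import torch
import torch.nn as nn

# My reading of the docs: for a 2D input (N, C), num_features must equal C,
# so an [N, 2048] tensor would need num_features=2048.
bn = nn.BatchNorm1d(2048)
x = torch.randn(4, 2048)   # (N, C) = (4, 2048)
out = bn(x)
print(out.shape)           # torch.Size([4, 2048])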

What’s going on here?

Also, what are N, C, and L to begin with?

I believe num_features in BatchNorm is the number of channels, not the time/spatial dimension.

N - Batch size
C - Features / Channels, 1 in your case
L - Length (number of samples), 2048 in your case
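
So for your 1-channel signal, the tensor would be shaped (N, C, L) = (1, 1, 2048) rather than (1, 2048). A minimal sketch (the variable names are just illustrative):

import torch
import torch.nn as nn

bn = nn.BatchNorm1d(1)         # num_features = C = 1 channel
x = torch.randn(1, 1, 2048)    # (N, C, L): batch of 1, 1 channel, 2048 samples
out = bn(x)
print(out.shape)               # torch.Size([1, 1, 2048])

The statistics are computed per channel over the batch and length dimensions, so even with a batch of 1 there are 2048 values per channel to normalize over.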

You can definitely have more than one channel; think of an ECG, where there are 12 leads and thus C=12.
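
A quick sketch of that multi-channel case (the batch size and length here are made up):

import torch
import torch.nn as nn

bn = nn.BatchNorm1d(12)           # C = 12 leads
ecg = torch.randn(8, 12, 2048)    # (N, C, L): 8 recordings, 12 leads, 2048 samples each
out = bn(ecg)
print(out.shape)                  # torch.Size([8, 12, 2048])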