I want to implement a BatchNorm1d that uses a single (scalar) gamma, beta, mean, and std rather than vectors. The layer should normalize the whole input mini-batch using its overall mean and variance, then multiply the normalized input by the scalar gamma and add the scalar beta. All parameters of this custom BatchNorm1d layer would be scalars rather than vectors.
Lasagne's batch norm layer offers a parameter named `axes` that can be used to achieve exactly this. I was wondering whether such functionality is already available in PyTorch, or whether I have to write my own batchnorm layer for it?
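For reference, here is a minimal sketch of what I mean, written as a custom `nn.Module`. The class name `ScalarBatchNorm1d` and the constructor defaults are my own choices, not an existing PyTorch API; it normalizes over all elements of the batch at once and keeps scalar running statistics:

```python
import torch
import torch.nn as nn


class ScalarBatchNorm1d(nn.Module):
    """Hypothetical sketch: batch norm with a single scalar gamma, beta,
    running mean, and running variance, computed over the whole mini-batch."""

    def __init__(self, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        # Scalar learnable affine parameters (shape (1,) so they broadcast).
        self.gamma = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))
        # Scalar running statistics for use in eval mode.
        self.register_buffer("running_mean", torch.zeros(1))
        self.register_buffer("running_var", torch.ones(1))

    def forward(self, x):
        if self.training:
            # One mean/variance over every element of the batch.
            mean = x.mean()
            var = x.var(unbiased=False)
            with torch.no_grad():
                self.running_mean.mul_(1 - self.momentum).add_(self.momentum * mean)
                self.running_var.mul_(1 - self.momentum).add_(self.momentum * var)
        else:
            mean = self.running_mean
            var = self.running_var
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

With the default gamma=1 and beta=0, the output of a training-mode forward pass should have mean roughly 0 and standard deviation roughly 1 over the whole batch.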