BatchNorm-like module that divides batch data by the std but does not subtract the mean

Is there a PyTorch function/module that only divides batch data by the std, without subtracting the mean?
Also, what is this operation called? Standardization?

I’m not sure whether you would like to apply this normalization to each sample or somewhere inside the model.
In the former case, use torchvision.transforms.Normalize and set the mean to zeros.
In the latter case, just divide by torch.std(x) (and specify the dimensions if needed).
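For the in-model case, a minimal sketch of dividing by the per-channel std of an NCHW tensor might look like this (the dims and eps are assumptions, chosen to match how BatchNorm2d reduces over batch and spatial dimensions):

```python
import torch

# Example NCHW batch.
x = torch.randn(8, 3, 32, 32)

# Std over batch and spatial dims (0, 2, 3), kept per channel.
std = torch.std(x, dim=(0, 2, 3), keepdim=True)

# Divide only; no mean subtraction. eps avoids division by zero.
y = x / (std + 1e-5)
```

After this, the per-channel std of `y` over dims (0, 2, 3) is approximately 1, while the mean is left untouched.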

Hi, thanks for the prompt reply!
I’m using this to replace BatchNorm2d, so I need a running std estimate; the approaches you mentioned don’t quite fit. I’ve edited the title accordingly.
I can adapt the manual implementation you gave here, but there’s a performance (compute/memory) degradation compared to the native BatchNorm2d.
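For reference, a divide-by-std-only replacement with a running variance buffer could be sketched roughly like this (the module name, defaults, and update rule are assumptions, not a PyTorch API, and this pure-Python version will indeed be slower than the fused native BatchNorm2d kernel):

```python
import torch
import torch.nn as nn

class BatchStd2d(nn.Module):
    """Hypothetical BatchNorm2d variant: divides by a running std
    estimate per channel but never subtracts the mean."""

    def __init__(self, num_features, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        # Running variance estimate, updated in training mode.
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x):
        if self.training:
            # Per-channel variance over batch and spatial dims.
            var = x.var(dim=(0, 2, 3), unbiased=False)
            with torch.no_grad():
                # Exponential moving average, as in BatchNorm.
                self.running_var.mul_(1 - self.momentum).add_(self.momentum * var)
            return x / torch.sqrt(var + self.eps).view(1, -1, 1, 1)
        # Eval mode: use the accumulated running variance.
        return x / torch.sqrt(self.running_var + self.eps).view(1, -1, 1, 1)
```

In training mode it normalizes with the current batch statistics and updates the buffer; in eval mode it uses the running estimate, mirroring BatchNorm2d’s train/eval behavior.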