Batchnorm/Layernorm

Instead of normalizing the data yourself, is it possible to just put a LayerNorm or BatchNorm layer in front of your NN so it normalizes the data over the last 100 data points? Say I have 6 input features and want to normalize over the last 100 inputs: how do I set up the LayerNorm/BatchNorm, and how would it look in code?
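A minimal sketch of what this could look like, assuming PyTorch (the question doesn't name a framework). Note that `BatchNorm1d` doesn't use exactly the last 100 points: it normalizes over the current batch during training and keeps exponential running statistics (controlled by `momentum`) for inference, while `LayerNorm` normalizes across the 6 features of each sample independently of any other data points. The layer sizes of the MLP behind the norm layer are placeholders.

```python
import torch
import torch.nn as nn

# BatchNorm1d(6): normalizes each of the 6 features over the batch
# dimension; tracks running mean/var via `momentum`, so it follows
# recent data rather than a fixed window of the last 100 points.
model_bn = nn.Sequential(
    nn.BatchNorm1d(6),   # 6 = number of input features
    nn.Linear(6, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)

# LayerNorm(6): normalizes across the 6 features of each individual
# sample, with no dependence on other samples or past batches.
model_ln = nn.Sequential(
    nn.LayerNorm(6),
    nn.Linear(6, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)

x = torch.randn(100, 6)      # a batch of 100 samples, 6 features each
out_bn = model_bn(x)
out_ln = model_ln(x)
print(out_bn.shape, out_ln.shape)
```

If the goal really is "normalize by the statistics of the last 100 inputs", neither layer does that out of the box; a rolling buffer that recomputes mean/std over the most recent 100 samples would be needed instead.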