How do I convert the following batch normalization layer from TensorFlow to PyTorch?
tf.contrib.layers.batch_norm(inputs=x,
                             decay=0.95,
                             center=True,
                             scale=True,
                             is_training=(mode == 'train'),
                             updates_collections=None,
                             reuse=reuse,
                             scope=(name + 'batch_norm'))
I couldn't find equivalents for some of these arguments (e.g. decay, updates_collections, reuse, scope) in PyTorch's BatchNorm layers.
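For reference, a rough mapping (a sketch, not a drop-in translation): TF's `decay=0.95` corresponds to PyTorch `momentum=0.05`, because TF updates running stats as `running = decay * running + (1 - decay) * new` while PyTorch uses `running = (1 - momentum) * running + momentum * new`. `center=True`/`scale=True` together correspond to `affine=True` (learnable beta and gamma). `is_training` is not a constructor argument in PyTorch; it is controlled by `module.train()`/`module.eval()`. `updates_collections=None` (update stats in place during the forward pass) is PyTorch's default behavior, and `reuse`/`scope` are TF variable-scope concepts with no direct PyTorch counterpart. The `num_features=64` below is a placeholder for the channel count of `x`:

```python
import torch
import torch.nn as nn

# TF decay=0.95  ->  PyTorch momentum=0.05 (complementary conventions)
# center=True, scale=True  ->  affine=True (learnable beta/gamma)
bn = nn.BatchNorm2d(num_features=64, momentum=0.05, affine=True)

x = torch.randn(8, 64, 16, 16)  # (N, C, H, W); C must match num_features

bn.train()          # is_training=True: normalize with batch stats, update running stats
y_train = bn(x)

bn.eval()           # is_training=False: normalize with the stored running stats
y_eval = bn(x)
```

Note that `nn.BatchNorm1d`/`nn.BatchNorm3d` exist for other input ranks; the constructor arguments are the same.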