Could you please guide me on how to use batch norm with FC layers? In my case it gives the same output value for different inputs.
Do you mean that after adding a BatchNorm layer to your model, the output is always the same, even for different inputs?
Could you provide a small working example?
In general, you just have to add an nn.BatchNorm1d layer between your linear layers:

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 20),
    nn.BatchNorm1d(20),
    nn.Linear(20, 2)
)

batch_size = 20
x = torch.randn(batch_size, 10)
output = model(x)
print(output)
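One detail worth checking, since it is a common cause of confusing BatchNorm outputs: in training mode, BatchNorm1d normalizes each feature using the statistics of the current batch, so a sample's output depends on the other samples in the batch. In eval mode it uses the stored running statistics instead, so each sample is processed independently. A small sketch (using the same toy model as above) illustrating the difference:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(
    nn.Linear(10, 20),
    nn.BatchNorm1d(20),
    nn.Linear(20, 2)
)

x = torch.randn(20, 10)

# Training mode: BatchNorm uses the statistics of the current batch,
# which also updates the running statistics.
model.train()
out_train = model(x)

# Eval mode: BatchNorm uses the running statistics, so every sample
# is normalized the same way regardless of batch composition.
model.eval()
out_eval = model(x)
out_single = model(x[:1])  # a batch of one is fine in eval mode

# The first sample's eval-mode output does not depend on the rest
# of the batch:
print(torch.allclose(out_eval[:1], out_single, atol=1e-6))
```

Note that calling the model on a single sample in training mode would raise an error, because batch statistics cannot be computed from one element.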
Found my mistake. Anyway, thanks for your effort.