Could you make sure you are using the posted code snippet, and don't forget to create an instance of nn.ReLU?
This code will reproduce your error:
layer1 = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1),
    nn.ReLU,  # missing () — passes the class itself, not a Module instance
    nn.BatchNorm2d(16),
)
while your posted code should work.
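For contrast, here is a minimal sketch of the fixed layer with nn.ReLU() properly instantiated; the input shape is just a dummy batch chosen for illustration:

```python
import torch
import torch.nn as nn

# Fixed version: nn.ReLU() is called, so Sequential receives
# a Module instance rather than the class object itself.
layer1 = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1),
    nn.ReLU(),  # note the parentheses
    nn.BatchNorm2d(16),
)

# Dummy input: batch of 1, 3 channels, 32x32.
x = torch.randn(1, 3, 32, 32)
out = layer1(x)
# kernel_size=3 with stride=1, padding=1 preserves spatial size,
# so the output is (1, 16, 32, 32).
print(out.shape)
```

Passing the class nn.ReLU (without parentheses) fails because nn.Sequential expects each argument to be an nn.Module instance, and a class object is not one.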