Hi all,
Here I am trying to improve accuracy on the MNIST dataset to 99.14%. I came across nn.BatchNorm2d(), which is described as a regularization technique that improves accuracy by reducing noise in the activations. But in the code below, while downloading the MNIST dataset, I also applied a torchvision transform named transforms.Normalize((0.1307,), (0.3081,)):
train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=64, shuffle=True)
Here, in the last transform, I am normalizing each tensor with a fixed mean and standard deviation.
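To make my question concrete, here is a toy sketch (plain Python, no torch) of what I understand transforms.Normalize((0.1307,), (0.3081,)) to do: it applies the same fixed affine transform to every pixel, where the mean and std are constants computed over the whole MNIST training set, not learned.

```python
def normalize(pixel, mean=0.1307, std=0.3081):
    """Mimic transforms.Normalize for one channel: (input - mean) / std,
    with mean/std being fixed dataset-level constants."""
    return (pixel - mean) / std

# After ToTensor(), pixels are in [0, 1]:
white = normalize(1.0)  # a white pixel is mapped to roughly 2.82
black = normalize(0.0)  # a black pixel is mapped to roughly -0.42
print(white, black)
```

So the transform shifts and rescales the input distribution once, before it ever reaches the network.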
1) What is the difference between the two?
2) Should I still use nn.BatchNorm2d(), or is it equivalent to transforms.Normalize()?
3) Even if I use it, does it make my images much darker or not?
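Here is a small plain-Python sketch (illustrative only, simplified to 1-D values) of the difference as I currently understand it: Normalize applies fixed constants once to the input, while a batch-norm layer recomputes the mean/variance from the current batch at training time and also carries learnable scale (gamma) and shift (beta) parameters.

```python
def fixed_normalize(batch, mean=0.1307, std=0.3081):
    # transforms.Normalize-style: the constants never change,
    # whatever the batch contains.
    return [(x - mean) / std for x in batch]

def batch_normalize(batch, eps=1e-5, gamma=1.0, beta=0.0):
    # BatchNorm-style (training mode): statistics come from the
    # batch itself; gamma/beta would be learned in a real layer.
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in batch]

batch = [0.0, 0.5, 1.0]
print(fixed_normalize(batch))  # same constants regardless of the batch
print(batch_normalize(batch))  # roughly zero-mean, unit-variance output
```

Is this mental model right, i.e. they are not interchangeable because one is a fixed input preprocessing step and the other is a learnable layer inside the network?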