I am getting started with PyTorch by browsing code, and in a lot of example code I see magic numbers used without any explanation or comment. What do these numbers mean? I come from a C/C++ background, where it is considered bad practice to scatter literal numbers through the code; you put named constants (e.g. #defines) at the top to explain the values. But these numbers do not seem specific to the project, nor do they look like one author's idiosyncrasy; they seem more like a PyTorch norm.
For example, on the PyTorch tutorial page, the parameters to the Normalize function:
transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
Other examples are the permute and view calls in one author's code on GitHub:
im = cv2.imread(dirpath + '/' + file)
im = torch.Tensor(im).permute(2, 0, 1).view(1, 3, 224, 224).double()
model.eval()
im -= torch.Tensor(np.array([129.1863, 104.7624, 93.5940])).double().view(1, 3, 1, 1)
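To make the shapes in that snippet concrete, here is a sketch of what the permute, view, and broadcast subtraction do, with a zero tensor standing in for the image loaded by cv2.imread (which returns an H x W x C array):

```python
import torch

im = torch.zeros(224, 224, 3)      # stand-in for a 224x224 3-channel image
im = im.permute(2, 0, 1)           # reorder axes H x W x C -> C x H x W
im = im.view(1, 3, 224, 224)       # add a batch dimension: N x C x H x W

# The subtracted tensor is reshaped to (1, 3, 1, 1) so that each of the
# three values broadcasts across the whole image of its channel.
channel_means = torch.tensor([129.1863, 104.7624, 93.5940]).view(1, 3, 1, 1)
im = im - channel_means
```

So 2, 0, 1 are axis indices, 1, 3, 224, 224 is the expected batch/channel/height/width shape, and the three floats are per-channel constants subtracted from every pixel.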
Thanks for the help!