Reapply torchvision.transforms to downloaded data

Hi, I am trying to run a simple VAE model written in PyTorch. To that end, I use the classic MNIST dataset via torchvision.datasets. The data has already been downloaded and is stored in its raw values (i.e., [0, 255]). Once I load it from the directory, the transform defined to normalize the data into [-1, 1] (zero mean and unit variance) is not applied, so the data is still in its original range [0, 255]. How could I do the scaling into [0, 1] and then reapply transforms.Normalize? Here is the code:

import torch
from torchvision import datasets, transforms

transform_train = transforms.Compose([
    transforms.ToTensor(),                                # PIL image [0, 255] -> float tensor in [0, 1]
    transforms.Normalize(mean=(0.1307,), std=(0.3081,))   # standardize with the MNIST mean/std
])

transform_test = transforms.Compose([
    transforms.ToTensor(),                                # same preprocessing for the test split
    transforms.Normalize(mean=(0.1307,), std=(0.3081,))
])

use_cuda = torch.cuda.is_available()

# download=False: the raw MNIST files are already stored locally
trainset = datasets.MNIST(root='../../data/MNIST', train=True, download=False, transform=transform_train)
valset = datasets.MNIST(root='../../data/MNIST', train=False, download=False, transform=transform_test)
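
For reference, here is a minimal sketch of how the scaling into [0, 1] and the normalization could be applied by hand to a single raw image, assuming the dataset's internal .data attribute (which holds the untransformed uint8 images); the variable names are purely illustrative:

raw_img = trainset.data[0]                     # uint8 tensor in [0, 255], shape (28, 28)
img = raw_img.float().div(255.0).unsqueeze(0)  # scale to [0, 1] and add a channel dim -> (1, 28, 28)
img = (img - 0.1307) / 0.3081                  # same standardization as transforms.Normalize
print(img.min(), img.max())                    # roughly tensor(-0.4242) and tensor(2.8215)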

The provided code normalizes the data, so it's not in its original range, as seen here:

x, y = trainset[0]
print(x.min(), x.max())
> tensor(-0.4242) tensor(2.8215)

Are you sure this code snippet was returning the raw data?

The data I have was downloaded without any transformation, and the values I see are in the range [0, 255], even after executing the code I posted in the thread. Notice that I set the download flag to False since I already have the data stored locally… If the data were in [0, 1], the mean and std would be 0.1307 and 0.3081, respectively.
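
For what it's worth, a minimal sketch of how those statistics can be reproduced from the raw images scaled to [0, 1], assuming the dataset's .data attribute:

data = trainset.data.float() / 255.0   # uint8 [0, 255] -> float [0, 1]
print(data.mean(), data.std())         # approximately tensor(0.1307) and tensor(0.3081)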

Your code unfortunately doesn't show how you are checking the data stats. As you can see in my code snippet, the min and max values show the transformed results, not the original data values.
In case you are checking the internal .data attribute, note that it holds the original data, to which no transformations are applied. Also, feel free to post an executable code snippet to reproduce the issue.
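
To illustrate the distinction, a short sketch (assuming the trainset defined in the snippet above):

x, y = trainset[0]                      # indexing the dataset applies the transform pipeline
print(x.dtype, x.min(), x.max())        # torch.float32, roughly tensor(-0.4242) and tensor(2.8215)

raw = trainset.data[0]                  # .data bypasses the transforms entirely
print(raw.dtype, raw.min(), raw.max())  # torch.uint8, tensor(0) and tensor(255)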