I loaded the dataset with ImageFolder as follows:
import torchvision

data_loader = torchvision.datasets.ImageFolder(
    '/content/drive/My Drive/Dataset/malimg_paper_dataset_imgs',
    transform=torchvision.transforms.Compose([
        torchvision.transforms.Resize((224, 224)),
        torchvision.transforms.ToTensor(),
        torchvision.transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                         std=[0.229, 0.224, 0.225]),
    ]))
The original dataset is 1.1 GB on disk, but the transforms resize and normalize every image. What will the size of the data be once it is loaded? I can't find anything about this in the documentation. Thanks.
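One way to estimate this yourself: after the transforms above, every sample becomes a 3 x 224 x 224 float32 tensor regardless of the original file size, so the per-image memory cost is fixed and can be computed directly. A minimal sketch (the shape and dtype follow from Resize((224, 224)) and ToTensor(); note that ImageFolder itself loads images lazily, so nothing occupies this memory until a sample is actually accessed):

```python
import torch

# Shape produced by Resize((224, 224)) + ToTensor(): (channels, height, width),
# stored as float32 (4 bytes per element). Normalize does not change the size.
sample = torch.zeros(3, 224, 224, dtype=torch.float32)

bytes_per_sample = sample.element_size() * sample.nelement()
print(bytes_per_sample)                      # 602112 bytes
print(bytes_per_sample / 1024 ** 2)          # ~0.574 MB per image
```

Multiplying that per-sample figure by the number of images (len(dataset)) gives the total size the transformed dataset would occupy if it were all held in memory at once.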