I am trying to use the `ImageFolder` class to create a custom dataset. My root folder is organized as described in the docs: `trainset/sequence[id]/[image_id].[extension]`
I am applying a standard `ToTensor` transformation, after which I am trying to apply a custom transformation using the `transforms.Lambda` class. Essentially, the lambda function I am passing as a parameter should serve as a feature extractor, feeding the tensor through a deep CNN (say, VGG16).
Here is the code I am using:
```python
import torch
import torch.nn as nn
from torchvision.models import vgg16, vgg19
import torch.utils.data
import torchvision.transforms as transforms
import torchvision.datasets as datasets


def build_feature_extractor(arch_name="vgg16"):
    """Initialize the architecture of choice and remove its last layers so
    that it can be used as a visual feature extractor."""
    if arch_name.lower() == "vgg16":
        return remove_last_layers(vgg16(pretrained=True))
    if arch_name.lower() == "vgg19":
        return remove_last_layers(vgg19(pretrained=True))


def get_features_batch(model, frames=None):
    """Forward the image tensors through the model of choice and return the
    visual features."""
    # make sure a valid set of frames has been passed
    assert frames, "No frames were provided"
    if torch.cuda.is_available():
        model = model.cuda()
    return model.forward(frames)


def feature_generator(data_dir, batch_size=32):
    print("\nLoading data ... ")
    feature_extractor = build_feature_extractor()
    to_features = lambda x: get_features_batch(feature_extractor, x)
    transformations = transforms.Compose([transforms.ToTensor(),
                                          transforms.Lambda(to_features)])
    images = datasets.ImageFolder(data_dir, transformations)
```
When I run this code, the `images` variable gets created, but as soon as I try to access any of its constituent elements through `images[i]`, I get the error mentioned in the title of this post.
I admit I have little familiarity with writing lambda functions, and even less with the `transforms.Lambda` class, so I would greatly appreciate it if someone could help me figure out where I am making a mistake!