I want to dump the data so that I can load it back later for training my model.
My code snippet for dumping the data:
for batch_idx, (image, label) in enumerate(dataloader):
    image, label = image.to(device), label.to(device)
    perturbed_image = attack.perturb(image, label)
    # ---------- Classifier ----------
    predict_A = classifier(perturbed_image)
    pred_label = torch.max(predict_A.data, 1)[1]
    # keep only the samples the classifier misclassifies
    if pred_label != label:
        adv_data.append((perturbed_image.to("cpu"), label.to("cpu")))
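The adv_data list is then written to disk with pickle, roughly like this (the file name is just a placeholder):

import pickle

with open("adv_data.pkl", "wb") as file:
    pickle.dump(adv_data, file)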
def load_data(list_):
    if len(list_[0]) == 2:
        img, lab = [], []
        for i in list_:
            img.append(i[0])
            lab.append(i[1])
        xs = torch.stack(img)
        xs = xs.squeeze(1)  # remove the per-sample batch dimension of size 1
        ys = torch.Tensor(lab)
        dataset = TensorDataset(xs, ys)
        del img, lab
        return dataset
with open(dir, "rb") as file:
    train_list = pickle.load(file)

train_set_1 = load_data(train_list)
train_loader = torch.utils.data.DataLoader(train_set_1)
Is there any other way I can dump it correctly so that I can load it into torch.utils.data.DataLoader?
This works well on its own, but I guess this is not exactly the way a usual torchvision.datasets.CIFAR10 dataset is stored and loaded. I want to concatenate it with torchvision.datasets.CIFAR10
in my train_loader, i.e., concatenating this:
train_set_2 = torchvision.datasets.CIFAR10(root="data", train=True)
with train_set_1 above.
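Is something like the following the right way to do it? This is only a rough sketch of what I have in mind, assuming torch.utils.data.ConcatDataset is the tool for this (the ToTensor transform and the batch size are just illustrative):

import torchvision
import torchvision.transforms as transforms
from torch.utils.data import ConcatDataset, DataLoader

train_set_2 = torchvision.datasets.CIFAR10(
    root="data", train=True, download=True,
    transform=transforms.ToTensor()  # so both datasets yield tensors
)

# combine the adversarial dataset with the original CIFAR10 training set
combined_set = ConcatDataset([train_set_1, train_set_2])
train_loader = DataLoader(combined_set, batch_size=64, shuffle=True)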