I’ve got a DataLoader loading a custom dataset.
loader = torch.utils.data.DataLoader(train_set, batch_size=batch_size, sampler=train_sampler,
pin_memory=(torch.cuda.is_available()), num_workers=0)
I want to get rid of Variable and use torch.tensor properly, without extra copying. An example of my training epoch looks like:
for batch_idx, (input, target) in enumerate(loader):
    # Create variables
    if torch.cuda.is_available():
        input_var = torch.autograd.Variable(input.cuda(async=True))
        target_var = torch.autograd.Variable(target.cuda(async=True))
    else:
        input_var = torch.autograd.Variable(input)
        target_var = torch.autograd.Variable(target)

    # compute output
    output = model(input_var)
    loss = torch.nn.functional.mse_loss(output, target_var)
With data coming from the DataLoader, should I change torch.autograd.Variable(input.cuda(async=True)) to torch.from_numpy(input).cuda(async=True) to avoid extra copying?
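If it helps, here is my best guess at the modern (PyTorch 0.4+) pattern, assuming .to(device, non_blocking=True) is the intended replacement for Variable(x.cuda(async=True)) and that device is defined as shown — I haven't verified that this actually avoids extra copies:

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    for batch_idx, (input, target) in enumerate(loader):
        # .to() should be a no-op when the tensor is already on the target device;
        # non_blocking=True lets the host-to-device copy overlap with compute,
        # which I believe only matters because the DataLoader uses pin_memory=True
        input = input.to(device, non_blocking=True)
        target = target.to(device, non_blocking=True)

        output = model(input)
        loss = torch.nn.functional.mse_loss(output, target)

Is that the right way to do it?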
Also, when I run a test epoch I have
input_var = torch.autograd.Variable(input.cuda(async=True), volatile=True)
target_var = torch.autograd.Variable(target.cuda(async=True), volatile=True)
What do I replace volatile with?
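From the 0.4 migration notes I gather that volatile was replaced by a context manager, so my guess (unverified) for the test loop is something like:

    # my guess at the volatile replacement: disable autograd tracking for the whole loop
    with torch.no_grad():
        for input, target in loader:
            input = input.to(device, non_blocking=True)
            target = target.to(device, non_blocking=True)
            output = model(input)
            loss = torch.nn.functional.mse_loss(output, target)

Is that right, or is there something else I should be using?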
Cheers