Why does some code use "torch.autograd.Variable(inputs)"?

I see some PyTorch training code that does something like this:

for batch_idx, (inputs, targets) in enumerate(trainloader):
    inputs, targets = torch.autograd.Variable(inputs), torch.autograd.Variable(targets)
    ...
    pred = model(inputs)
    ...

Why? I think (inputs, targets) are already tensors, so we could directly do this:

for batch_idx, (inputs, targets) in enumerate(trainloader):
    pred = model(inputs)
    ...

Am I misunderstanding something, or is the first usage just from an old version of PyTorch?

Thanks in advance.

Yes, since PyTorch 0.4 it's no longer needed. Variable and Tensor were merged in that release: tensors now track gradients themselves (via requires_grad), so wrapping inputs in torch.autograd.Variable does nothing useful. The call is kept for backward compatibility and simply returns the tensor you pass in, so the first snippet was most likely written for PyTorch 0.3 or earlier.
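A quick sketch to verify this yourself (assuming PyTorch >= 0.4 is installed; the tensor shapes here are arbitrary):

```python
import torch
from torch.autograd import Variable  # kept only for backward compatibility

x = torch.ones(2, 2)
v = Variable(x)               # no longer wraps anything: v is a plain Tensor
print(type(v))                # <class 'torch.Tensor'>

# Autograd now works on tensors directly via requires_grad
w = torch.ones(2, 2, requires_grad=True)
loss = (w * 3).sum()
loss.backward()
print(w.grad)                 # every element is 3
```

So in your training loop you can feed the batches from the DataLoader straight into the model, as in your second snippet.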