Variable with requires_grad=False

I am following someone else's codebase for personal research purposes.
While going through it, I came across several snippets like these:

target_var = Variable(target_vec, requires_grad=False)

Variable(last_acc[name], requires_grad=False).view(-1)

I know that Variable has been deprecated and we should be using Tensors now. What doesn't make sense to me is this: wasn't Variable supposed to be used precisely when gradients had to be calculated?

What's the point of using Variable and then setting requires_grad=False?
That seems redundant to me.
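For example, if I check in a current PyTorch build (the values below are just placeholders for the real target_vec from the codebase), a freshly created tensor already defaults to requires_grad=False, so the Variable wrapper seems to add nothing:

```python
import torch
from torch.autograd import Variable

# placeholder data standing in for target_vec from the original codebase
target_vec = torch.tensor([1.0, 2.0, 3.0])
print(target_vec.requires_grad)  # False by default

# wrapping it in Variable and passing requires_grad=False changes nothing
target_var = Variable(target_vec, requires_grad=False)
print(target_var.requires_grad)  # still False
```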

In Tensor terms, the declaration should look like this, right?

target_var = torch.Tensor(target_vec, requires_grad=False) 

Before Variable and Tensor were merged, you could only do computation between tensors or between variables, but not mix them.

We don't use the Tensor constructor anymore, only the torch.tensor factory function.
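A minimal sketch of the difference, with placeholder data since target_vec comes from your codebase:

```python
import torch

# the legacy constructor does not accept requires_grad at all:
# torch.Tensor([1.0, 2.0], requires_grad=False)  # raises TypeError

# the factory function does accept it, and False is the default anyway:
target_var = torch.tensor([1.0, 2.0, 3.0])
print(target_var.requires_grad)  # False

# you only opt in when you actually need gradients:
leaf = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
print(leaf.requires_grad)  # True
```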

Best regards



Sorry if I sound like a noob, but I don't understand what that implies. Do you mean I cannot use torch.Tensor() to create a new tensor from another one?