Has torch.autograd.Variable been replaced?

Hi, I’m just wondering: has torch.autograd.Variable been replaced by torch.tensor(…, requires_grad=True)?

And why, when I use this code,

t = torch.from_numpy(matrix_np).clone()
matrix_torch = torch.tensor(t, requires_grad=True)

does it show this warning?
matrix_torch = torch.tensor(t.clone(), requires_grad=True)
main:1: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).

What is the best way to turn a NumPy array into a tensor that I can later backprop through?


The idea is that every Tensor is a Variable now.
In your case, you can do: matrix_torch = torch.from_numpy(matrix_np).clone().requires_grad_().
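A minimal sketch of that suggestion, using a small made-up array (matrix_np here is just a placeholder for your own data):

```python
import numpy as np
import torch

# Hypothetical input array; any float NumPy array works the same way.
matrix_np = np.array([[1.0, 2.0], [3.0, 4.0]])

# Copy into a leaf tensor that participates in autograd.
matrix_torch = torch.from_numpy(matrix_np).clone().requires_grad_()

# The tensor can now be used in a computation and backpropagated through.
loss = (matrix_torch ** 2).sum()
loss.backward()

print(matrix_torch.requires_grad)  # True
print(matrix_torch.grad)           # d(loss)/dx = 2 * x, elementwise
```

Because the tensor is created directly with requires_grad set, it is a leaf of the graph and .grad is populated after backward().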

Thanks @albanD
Do I really need .clone() here? If so, why?

No, you don’t really need the clone.
Note that when you do t = torch.from_numpy(matrix_np), t and matrix_np actually share memory, so modifying matrix_np in place modifies t as well. You might want to clone to avoid any such issue.
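A quick demonstration of that memory sharing, with a throwaway array:

```python
import numpy as np
import torch

matrix_np = np.zeros(3)

t = torch.from_numpy(matrix_np)          # shares memory with matrix_np
c = torch.from_numpy(matrix_np).clone()  # independent copy

matrix_np[0] = 5.0                       # in-place change to the NumPy array

print(t[0].item())  # 5.0 — the shared tensor sees the change
print(c[0].item())  # 0.0 — the clone does not
```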