From what I’ve read, sparse tensor support is still fairly limited, so I don’t have much confidence that this is possible: can a sparse tensor be constructed from a Variable? If not, how could I take a list of N indices and a list of N values, and put the nth value at the nth index of some new tensor?

Essentially I’m looking for an equivalent to the sparse tensor construction, but where the values can be Variable.

You can always call .data to get a tensor from a variable.

```
import torch
from torch.autograd import Variable
i = Variable(torch.LongTensor([[0, 1, 1, 2, 2],
                               [2, 0, 2, 1, 0]]))
v = Variable(torch.FloatTensor([3, 4, 5, 1, 2]))
# .data extracts the underlying tensor, detaching it from the autograd graph
x = Variable(torch.sparse.FloatTensor(i.data, v.data, torch.Size([3, 3])))
```

There is better support for this on master right now, though, if building from source is an option.

I know that this is possible, but it doesn’t preserve the computation graph, which is what I’m aiming for.
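
If a dense result is acceptable, one graph-preserving workaround (a sketch not taken from this thread, using small made-up indices and values) is the out-of-place `scatter`, which is differentiable with respect to the source values:

```
import torch
from torch.autograd import Variable

# N indices and N values; gradients should flow back into v
idx = Variable(torch.LongTensor([4, 1, 3]))
v = Variable(torch.FloatTensor([10.0, 20.0, 30.0]), requires_grad=True)

# Out-of-place scatter puts v[n] at position idx[n] of a zero tensor
# without leaving the autograd graph.
out = Variable(torch.zeros(6)).scatter(0, idx, v)

out.sum().backward()
# Each value appears exactly once, so v.grad is a tensor of ones,
# confirming the graph was preserved.
```

This only helps if the target tensor is small enough to materialize densely; it doesn’t give you an actual sparse tensor with autograd support.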