torch.sparse.FloatTensor does not work with a simple linear layer (nn.Linear(input_size, 10)) - RuntimeError: unsqueeze is not implemented for type torch.sparse.FloatTensor

Hi,

When I try to use a sparse matrix as input to a simple linear layer, I get the following error: RuntimeError: unsqueeze is not implemented for type torch.sparse.FloatTensor.
Below is a short script that reproduces the error.
Thank you for your help,
Ortal

import torch
import torch.nn as nn
from torch.autograd import Variable

class ScorerModule(nn.Module):
    def __init__(self):
        super(ScorerModule, self).__init__()
        self.LinearLayer = nn.Linear(10, 1)

    def forward(self, x):
        return self.LinearLayer(x)

module = ScorerModule()
optimizer = torch.optim.SGD(module.parameters(), lr=0.01)
criterion = nn.L1Loss()

i = torch.LongTensor([[0, 1, 9]])
v = torch.FloatTensor([3, 4, 5])
s = Variable(torch.sparse.FloatTensor(i, v, torch.Size([10])))

outputs = module(s)

Traceback (most recent call last):
  File "", line 20, in <module>
  File "/usr/local/lib/python3.5/site-packages/torch/nn/modules/module.py", line 357, in __call__
    result = self.forward(*input, **kwargs)
  File "", line 10, in forward
  File "/usr/local/lib/python3.5/site-packages/torch/nn/modules/module.py", line 357, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/torch/nn/modules/linear.py", line 55, in forward
    return F.linear(input, self.weight, self.bias)
  File "/usr/local/lib/python3.5/site-packages/torch/nn/functional.py", line 837, in linear
    output = input.matmul(weight.t())
  File "/usr/local/lib/python3.5/site-packages/torch/autograd/variable.py", line 386, in matmul
    return torch.matmul(self, other)
  File "/usr/local/lib/python3.5/site-packages/torch/functional.py", line 169, in matmul
    return torch.mm(tensor1.unsqueeze(0), tensor2).squeeze_(0)
RuntimeError: unsqueeze is not implemented for type torch.sparse.FloatTensor

Hi,

This is expected behaviour.
Sparse x dense operations are not currently supported; see the corresponding GitHub issue tracking this, for example.
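As a practical workaround (a sketch of my own, not something stated in the answer above), you can densify the sparse input with .to_dense() before handing it to the linear layer. The example below also uses the current torch.sparse_coo_tensor constructor, since torch.sparse.FloatTensor is deprecated in recent PyTorch versions:

```python
import torch
import torch.nn as nn

linear = nn.Linear(10, 1)

# Same sparse vector as in the question, built with the
# modern constructor instead of torch.sparse.FloatTensor.
i = torch.LongTensor([[0, 1, 9]])   # indices of the non-zero entries
v = torch.FloatTensor([3, 4, 5])    # their values
s = torch.sparse_coo_tensor(i, v, (10,))

# Densify before the dense-only linear layer.
out = linear(s.to_dense())
print(out.shape)  # torch.Size([1])
```

This trades away the memory savings of the sparse representation, so it is only reasonable when the densified input fits comfortably in memory.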