Hello all! I am currently working with Kronecker products, which lead to enormous amounts of sparsity, and dense tensors are not feasible because they spill out of my VRAM. Moreover, I am not working with neural nets; I am computing kernel matrices for SVMs, so doing things in batches is not straightforward. Thus I am turning to sparse tensors. What I would like to do is, given some sparse tensor, delete rows based on a mask vector v = [1 0 1 0 0 0]: if the ith entry of v is 1, then the ith row of my sparse tensor should be set to all zeros. I tried just doing sparse_tensor[v], but this gave an error saying that this isn't supported for sparse tensors. Is there any workaround that avoids looping over the rows?
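To make the goal concrete, here is a minimal sketch of the operation I'm after, with a made-up 6 x 4 COO tensor (the shapes and values are arbitrary, only the masking semantics matter):

import torch

# toy sparse tensor: 6 x 4 with a few nonzero entries
indices = torch.tensor([[0, 1, 2, 4],   # row indices
                        [0, 2, 1, 3]])  # column indices
values = torch.tensor([1.0, 2.0, 3.0, 4.0])
S = torch.sparse_coo_tensor(indices, values, (6, 4))

v = torch.tensor([1, 0, 1, 0, 0, 0], dtype=torch.bool)  # rows to zero out

# what I want, conceptually (this only works on dense tensors):
D = S.to_dense()
D[v] = 0          # rows 0 and 2 become all zeros
# ...whereas S[v] on the sparse tensor itself raises an error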
I was able to fix this by using the sparse Kronecker code here: Kron with gradient for sparse tensors · Issue #134069 · pytorch/pytorch · GitHub
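For context, this is roughly the idea behind a sparse COO Kronecker product. This is not the code from the linked issue, just a rough sketch I put together; it ignores gradients and assumes 2-D inputs:

import torch

def sparse_kron(A: torch.Tensor, B: torch.Tensor) -> torch.Tensor:
    # Kronecker product of two 2-D sparse COO tensors (sketch, no autograd handling)
    A, B = A.coalesce(), B.coalesce()
    (m, n), (p, q) = A.shape, B.shape
    ai, av = A.indices(), A.values()   # (2, nnzA), (nnzA,)
    bi, bv = B.indices(), B.values()   # (2, nnzB), (nnzB,)
    # every pair (entry of A, entry of B) contributes one output entry
    rows = (ai[0].unsqueeze(1) * p + bi[0].unsqueeze(0)).reshape(-1)
    cols = (ai[1].unsqueeze(1) * q + bi[1].unsqueeze(0)).reshape(-1)
    vals = (av.unsqueeze(1) * bv.unsqueeze(0)).reshape(-1)
    return torch.sparse_coo_tensor(torch.stack([rows, cols]), vals, (m * p, n * q))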
For deleting rows I did this:
import torch

# allowed_indices holds the row numbers that should survive
allowed_indices = your_allowed_indices
# mark every stored entry whose row index is in the allowed set
keep_mask = torch.isin(new_indices[0], allowed_indices)
new_values = new_values[keep_mask]
# broadcast the mask over both index rows, then reassemble the (2, nnz) index tensor
keep_mask = keep_mask.expand(2, new_indices.size(1))
new_indices = new_indices[keep_mask].view(2, -1)
This deleted both the indices and the values of the disallowed rows from the resulting sparse Kronecker product.
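To tie this back to the mask formulation from my original question: allowed_indices can be derived directly from v, and the filtered components can be reassembled into a sparse tensor. This is just how I'd sketch it (new_indices, new_values follow the snippet above, and size stands in for the shape of the Kronecker product):

import torch

# v marks rows to zero out, so the allowed rows are where v == 0
v = torch.tensor([1, 0, 1, 0, 0, 0])
allowed_indices = torch.nonzero(v == 0).flatten()

keep_mask = torch.isin(new_indices[0], allowed_indices)
filtered = torch.sparse_coo_tensor(new_indices[:, keep_mask],
                                   new_values[keep_mask],
                                   size)  # size = shape of the original Kronecker product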