Sparse Semi-Structured Conv2d tensors?

Dear reader,

Is it possible to use Sparse Semi-Structured Tensors with nn.Conv2d layers, or is this not yet supported?

Currently all the examples, like the tutorial "(beta) Accelerating BERT with semi-structured (2:4) sparsity" (PyTorch Tutorials 2.3.0+cu121 documentation), cover the nn.Linear layer, not nn.Conv2d. Furthermore, according to the torch.sparse documentation (PyTorch 2.1), the tensors must be 2D, while an nn.Conv2d weight is usually 4D. Is it possible to prune along the channel dimension, just like NVIDIA does with their ASP (Automatic SParsity) framework ("Channel Permutations for N:M Sparsity")?
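For context, here is a rough sketch of what I have in mind: flattening the 4D conv weight into a 2D matrix (rows = output channels) and applying a 2:4 magnitude mask to it. This is purely my own illustration in NumPy, not a supported PyTorch API, and all the names are mine:

```python
import numpy as np

# Hypothetical Conv2d weight of shape (out_channels, in_channels, kH, kW).
rng = np.random.default_rng(0)
weight = rng.standard_normal((8, 4, 3, 3)).astype(np.float32)

out_ch = weight.shape[0]
w2d = weight.reshape(out_ch, -1)        # (8, 36): the 2D view a sparse kernel would expect
groups = w2d.reshape(out_ch, -1, 4)     # split each row into contiguous groups of 4

# 2:4 pattern: keep the 2 largest-magnitude entries per group, zero the 2 smallest.
idx = np.argsort(np.abs(groups), axis=-1)               # ascending by magnitude
mask = np.ones_like(groups, dtype=bool)
np.put_along_axis(mask, idx[..., :2], False, axis=-1)   # drop the 2 smallest per group

pruned2d = np.where(mask, groups, 0.0).reshape(out_ch, -1)
pruned4d = pruned2d.reshape(weight.shape)               # back to the Conv2d layout

# Every group of 4 now holds exactly 2 nonzeros.
nnz_per_group = (pruned2d.reshape(out_ch, -1, 4) != 0).sum(axis=-1)
print(nnz_per_group.min(), nnz_per_group.max())  # -> 2 2
```

My question is whether something equivalent (reshape, prune, then convert with to_sparse_semi_structured and get an accelerated conv) is actually supported, or whether the 2D restriction rules Conv2d out for now.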

Could you advise me on how to apply Sparse Semi-Structured sparsity to Conv2d layers, if that is possible, and provide an example? Or should I still use NVIDIA's framework for this?