Torch sparse activation function

Hello,

In my model, I would like to apply a non-linear activation such as the `torch.nn.ReLU` module or the `torch.nn.functional.relu` function to a sparse tensor, but both raise a `NotImplementedError`.
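
Here is a minimal sketch of what I am doing (the tensor contents are just illustrative; on my PyTorch version both calls fail):

```python
import torch

# Build a small 2x3 sparse COO tensor with two non-zero entries
indices = torch.tensor([[0, 1],   # row indices
                        [2, 0]])  # column indices
values = torch.tensor([1.5, -2.0])
sparse = torch.sparse_coo_tensor(indices, values, size=(2, 3))

# Both of these raise NotImplementedError for me:
out = torch.nn.ReLU()(sparse)
out = torch.nn.functional.relu(sparse)
```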
Are there activation functions that I can apply to a sparse tensor?
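
For reference, one workaround I have been considering is applying the activation to the stored values only and rebuilding the sparse tensor. I assume this is only valid for functions with f(0) = 0 (like ReLU), since the implicit zeros must stay zero; I am not sure whether this is the recommended approach:

```python
import torch

def sparse_relu(x: torch.Tensor) -> torch.Tensor:
    # Coalesce first so indices are unique and values are summed
    x = x.coalesce()
    # Apply ReLU only to the explicitly stored values; because
    # relu(0) == 0, the sparsity pattern is preserved
    return torch.sparse_coo_tensor(x.indices(),
                                   torch.relu(x.values()),
                                   x.size())
```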