How to use torch.onnx.export with a custom input datatype, like SparseTensor?

In the torchsparse repo there is a custom datatype, `SparseTensor`:

import torch


class SparseTensor:
    def __init__(self, feats, coords, cur_tensor_stride=1):
        self.F = feats              # feature tensor
        self.C = coords             # coordinate tensor
        self.s = cur_tensor_stride  # current tensor stride
        self.coord_maps = {}        # cached coordinates keyed by stride
        self.kernel_maps = {}       # cached kernel maps

    def check(self):
        if self.s not in self.coord_maps:
            self.coord_maps[self.s] = self.C

    def cuda(self):
        assert type(self.F) == torch.Tensor
        assert type(self.C) == torch.Tensor
        self.F = self.F.cuda()
        self.C = self.C.cuda()
        return self

    def detach(self):
        assert type(self.F) == torch.Tensor
        assert type(self.C) == torch.Tensor
        self.F = self.F.detach()
        self.C = self.C.detach()
        return self

    def to(self, device, non_blocking=True):
        assert type(self.F) == torch.Tensor
        assert type(self.C) == torch.Tensor
        self.F = self.F.to(device, non_blocking=non_blocking)
        self.C = self.C.to(device, non_blocking=non_blocking)
        return self

    def __add__(self, other):
        tensor = SparseTensor(self.F + other.F, self.C, self.s)
        tensor.coord_maps = self.coord_maps
        tensor.kernel_maps = self.kernel_maps
        return tensor
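
For context, a SparseTensor is built from a plain feature tensor and a plain coordinate tensor, roughly like this (the shapes here are just an illustration of mine, not taken from the repo):

feats = torch.rand(1000, 4)                      # N points, 4 feature channels each
coords = torch.randint(0, 100, (1000, 4)).int()  # N integer point coordinates (made-up shape)
x = SparseTensor(feats, coords).cuda()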

I want to export a model that takes a SparseTensor as input to ONNX, but when I ran torch.onnx.export, I got this error:

RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. 
Dictionaries and strings are also accepted but their usage is not recommended. 
But got unsupported type SparseTensor
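
For reference, my export call looks roughly like this (model and file names are placeholders for my actual code):

# Minimal sketch of the failing call: `model` is a torchsparse-based network
# whose forward() takes a SparseTensor, and `x` is the SparseTensor built above.
torch.onnx.export(
    model,
    (x,),          # a SparseTensor input -> triggers the RuntimeError above
    "model.onnx",
    opset_version=11,
)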

This problem likely applies to other custom data types as well.

I also noticed this sentence in the torch.onnx.export documentation (torch/onnx/__init__.py). What exactly does it mean?

Any non-Tensor arguments (including None) will be hard-coded into the exported model
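
My (possibly wrong) reading of that sentence is that non-Tensor arguments are traced as constants and baked into the graph, instead of becoming graph inputs. A toy example of what I mean (my own code, not from the repo):

import torch
import torch.nn as nn

class Scale(nn.Module):
    def forward(self, x, factor=2.0):
        # `factor` is a plain Python float, not a Tensor
        return x * factor

# The value 3.0 is traced as a constant, so the exported ONNX graph always
# multiplies by 3.0 and has no `factor` input.
torch.onnx.export(Scale(), (torch.rand(1, 3), 3.0), "scale.onnx")

If that reading is right, I don't see how a structured input like SparseTensor could ever appear as a graph input.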

Thanks in advance for any help!