Create adaptation of the softmax function

I am pretty new to PyTorch and am currently looking into the softmax function. I would like to adapt the original implementation for some small tests.

I have been through the docs, but there wasn't much useful information about the implementation itself.

import torch.nn as nn
import torch.nn.functional as F

class Softmax(nn.Module):
    def __init__(self, dim=None):
        super(Softmax, self).__init__()
        self.dim = dim

    def __setstate__(self, state):
        self.__dict__.update(state)
        # Checkpoints saved before 'dim' existed may lack it; default to None.
        if not hasattr(self, 'dim'):
            self.dim = None

    def forward(self, input):
        return F.softmax(input, self.dim, _stacklevel=5)

Where can I find the F.softmax implementation? Is this probably a C implementation?
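For reference, my current understanding is that softmax itself can be written in a few lines of plain PyTorch. This is my own sketch of the math, not the library's actual code (which I assume does something similar internally for numerical stability):

```python
import torch

def my_softmax(x, dim=-1):
    # Subtract the per-slice max before exponentiating; this is the
    # standard trick to avoid overflow and does not change the result.
    x = x - x.max(dim=dim, keepdim=True).values
    e = x.exp()
    return e / e.sum(dim=dim, keepdim=True)

t = torch.tensor([[1.0, 2.0, 3.0]])
out = my_softmax(t, dim=1)  # should match torch.softmax(t, dim=1)
```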

One of the things I want to try, for instance, is the soft-margin softmax described here: Soft-Margin Softmax for Deep Classification

Is the C implementation the best place to start, or would it be easier to write my own version in Python first?
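Here is a rough sketch of how I imagined the soft-margin idea in plain PyTorch: subtract a margin from each sample's target-class logit before the usual softmax cross-entropy. The class name, the default margin, and the details are my own guesses based on my reading of the paper, not reference code:

```python
import torch
import torch.nn.functional as F

class SoftMarginSoftmaxLoss(torch.nn.Module):
    # Sketch only: penalize the target logit by a fixed margin so the
    # network must separate classes by more than the margin to win.
    def __init__(self, margin=0.5):
        super().__init__()
        self.margin = margin

    def forward(self, logits, target):
        # Subtract the margin only from each sample's target-class logit.
        adjusted = logits.clone()
        adjusted[torch.arange(logits.size(0)), target] -= self.margin
        # With margin=0 this reduces to ordinary cross-entropy.
        return F.cross_entropy(adjusted, target)
```

If something like this is a reasonable starting point, it would suggest I can stay entirely in Python and never touch the C code.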

Thanks in advance!