Automatic differentiation with a custom layer

Dear PyTorch users,

I am relatively new to PyTorch and trying to implement a custom layer.

  1. My forward() performs a non-trivial set of operations, and I am wondering whether the gradients will be computed automatically, or whether I have to implement a function that returns these gradients explicitly (through backward())?

Here is my custom layer:

class SNE(nn.Module):
    def __init__(self, input_features, output_features, kmeans):
        super(SNE, self).__init__()
        self.input_features = input_features
        self.output_features = output_features

        # Register parameters
        self.weight = nn.Parameter(torch.Tensor(output_features, input_features))

        # Initialize weights from the kmeans algorithm = kmeans

    def forward(self, input):
        output = 1 + torch.cdist(input, self.weight)       # compute pairwise distances
        output = 1 / output                                # invert
        output = output / output.sum(dim=1, keepdim=True)  # normalize each row
        return output

An alternative would be to call a new_func() (inheriting from Function) inside the forward() above, with its own forward() and backward() implementations. But then my backward() would look a bit messy and would likely be inefficient or error-prone.
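For what it's worth, here is a small standalone sketch of what I am trying to check (my own test, not from the template): the forward computation extracted as a plain function, passed through torch.autograd.gradcheck, which compares autograd's analytic gradients against finite-difference estimates. The shapes and names here are my assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Assumed shapes: weight is (output_features, input_features) = (4, 3)
weight = nn.Parameter(torch.randn(4, 3, dtype=torch.double))

def forward(inp):
    out = 1 + torch.cdist(inp, weight)           # pairwise distances, shifted by 1
    out = 1 / out                                # invert
    return out / out.sum(dim=1, keepdim=True)    # normalize each row

# gradcheck needs double precision; it only checks gradients w.r.t.
# the inputs passed in the tuple (here inp, not the captured weight).
inp = torch.randn(5, 3, dtype=torch.double, requires_grad=True)
print(torch.autograd.gradcheck(forward, (inp,)))
```

If gradcheck returns True, autograd is differentiating the whole forward pass correctly without any hand-written backward().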

  2. My __init__() is taken directly from the template given HERE. Also, I do not really understand what is going on with output_features in this template; would someone mind explaining it to me?
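For context on my second question, here is my current reading of output_features (this is my own interpretation, not from the template): the weight has one row per output feature, so torch.cdist maps a (batch, input_features) input to a (batch, output_features) output.

```python
import torch
import torch.nn as nn

# Assumed toy sizes for illustration
input_features, output_features = 3, 4
weight = nn.Parameter(torch.randn(output_features, input_features))

x = torch.randn(5, input_features)   # batch of 5 input vectors
d = torch.cdist(x, weight)           # distance from each input to each of the 4 weight rows
print(d.shape)                       # torch.Size([5, 4]) -> (batch, output_features)
```

So, as I understand it, output_features is just the number of rows of self.weight (one cluster centre per row, in my kmeans setting).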

Thanks in advance.