How can I define a new layer with autograd?

Hi, I am new to PyTorch and also new to autograd.
I am defining a new layer. In the layer's forward function I create a new tensor with requires_grad=True, run a series of calculations on it, and return it as my output. How should I write that 'series of calculations' so that the layer works correctly with autograd?
Here is my code. I know autograd can't differentiate through it as written. How can I fix it, or do I have no choice but to write the backward function myself?
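My current understanding (please correct me if wrong) is that any chain of torch operations applied to a tensor that requires grad is differentiated automatically, with no hand-written backward; a tiny sketch of what I mean:

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2 + 1).sin().sum()  # a "series of calculations" built from torch ops
y.backward()                 # autograd derives the backward pass for the chain
# x.grad is now populated with d(y)/d(x)
assert x.grad is not None
```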

class maskInstanceNorm2d(nn.Module):
    def __init__(self, num_feature, eps=1e-5, momentum=0.1, affine=False, track_running_stats=False):
        super(maskInstanceNorm2d, self).__init__()
        self.num_feature = num_feature
        self.norm = nn.InstanceNorm2d(num_feature, eps, momentum, affine, track_running_stats)

    def forward(self, obs, mask):
        # obs is a tensor of shape (batch, k, x, y)
        # mask holds each sample's real x and y lengths (the rest is padding)
        # Goal: the real InstanceNorm output, without the padding's influence
        batchSize, feat, xLen, yLen = obs.size()
        # Do NOT set requires_grad=True here: output is filled with values
        # computed from obs, so autograd tracks it automatically; an in-place
        # write into a leaf that requires grad would raise a RuntimeError.
        output = torch.zeros(obs.size(), device=obs.device)
        for batchNum in range(batchSize):
            xMask = mask[batchNum][0]
            yMask = mask[batchNum][1]
            Slice =, batchNum)[:, :xMask, :yMask].reshape(1, feat, xMask, yMask)
            output[batchNum, :, :xMask, :yMask] = self.norm(Slice)[0]
        return output
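The pattern I am unsure about can be checked in isolation: filling a fresh torch.zeros tensor slice-by-slice with differentiable values seems to be differentiable itself, as long as the zeros tensor does not have requires_grad=True. A minimal standalone check (the sizes and lengths below are made-up examples, not from my real data):

```python
import torch
import torch.nn as nn

norm = nn.InstanceNorm2d(2)                        # 2 feature channels
obs = torch.randn(3, 2, 4, 4, requires_grad=True)  # (batch, k, x, y)
lengths = [(4, 4), (2, 3), (3, 2)]                 # hypothetical real (x, y) per sample

# Build the output by writing normalized, unpadded slices into a zeros tensor.
# No requires_grad=True here -- autograd tracks the slice assignments.
output = torch.zeros_like(obs)
for b, (xm, ym) in enumerate(lengths):
    s = obs[b, :, :xm, :ym].unsqueeze(0)   # (1, k, xm, ym)
    output[b, :, :xm, :ym] = norm(s)[0]

# A weighted sum as a dummy loss (a plain sum of instance-norm output is
# nearly constant, so its gradient would be ~0 and prove nothing).
w = torch.randn_like(output)
(output * w).sum().backward()

# obs.grad has shape (3, 2, 4, 4); positions the norm never read (padding)
# receive zero gradient, the unpadded region receives nonzero gradient.
```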