Where is the `grad` defined?

Continuing the discussion from How can I get the gradients of the weights of each layer?:

Hi @ptrblck,

I was trying to find where the grad attribute is defined in both the nn.Linear and nn.Conv2d classes, but all I could find is the PairwiseDistance class defined in PairwiseDistance.py, shown below:

class PairwiseDistance(Module):

    def __init__(self, p):
        super(PairwiseDistance, self).__init__()
        assert p % 1 == 0
        self.gradInput = []
        self.diff = torch.Tensor()
        self.norm = p

        self.outExpand = None
        self.grad = None
        self.ones = None
...

I wonder, is this where the grad in model.my_layer.weight.grad is defined? But it seems to me that neither nn.Linear nor nn.Conv2d inherits from PairwiseDistance. Thank you.

Modules don’t define a grad attribute; parameters and tensors do.
You can find the attribute/property definition here. tensor.grad will be populated e.g. after a backward() call or a torch.autograd.grad call, and will give you the gradient of the differentiated output (e.g. the loss) w.r.t. this tensor.
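For example (a minimal sketch, not from the original thread; the layer shapes and names are just for illustration):

import torch
import torch.nn as nn

layer = nn.Linear(4, 2)
print(layer.weight.grad)          # None -- no backward pass has run yet

x = torch.randn(3, 4)
loss = layer(x).sum()
loss.backward()                   # populates .grad on all parameters used in the graph
print(layer.weight.grad.shape)    # torch.Size([2, 4]); gradient of loss w.r.t. the weight

# torch.autograd.grad returns the gradients directly instead of storing them in .grad
grads = torch.autograd.grad(layer(x).sum(), layer.weight)
print(grads[0].shape)             # torch.Size([2, 4])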