How does PyTorch support backpropagation through Python's [] (__getitem__) op?

I would like to ask about a tensor's [] operation, i.e., __getitem__, which supports backpropagation when indexing or slicing. However, I do not know how this is implemented. Is there an operator overload, or some code that shows how this operation supports both the forward and backward pass? I would like to know how it is registered and how different backward functions are dispatched under different indexing conditions. Thank you.

Supplement:
With a snippet like the one below, we can see that its grad_fn is SelectBackward.
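A minimal reproduction of that observation (in recent PyTorch releases the class is named SelectBackward0; older versions print SelectBackward):

```python
import torch

a = torch.randn(2, 3, 4, requires_grad=True)

# Integer indexing dispatches to a select op, so the result's
# grad_fn is a SelectBackward node (SelectBackward0 in recent versions).
b = a[0][1][1]
print(b.grad_fn)  # e.g. <SelectBackward0 object at 0x...>

# Backward scatters the incoming gradient into the indexed position of `a`.
b.backward()
print(a.grad[0, 1, 1])  # tensor(1.)
```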

But I don't know why, in this situation a[0][1][1], the [] turns into a select op in PyTorch. Who registers it?

You are showing Python examples, but what I can say with my limited knowledge is that this is part of the Tensor structure/object (as opposed to the pure mathematical term), and yes, these objects have custom operator overloads, so the tensor "saves" what has been done to it.
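As an illustration of that general mechanism, here is a user-level sketch with torch.autograd.Function. Note this is not the code path PyTorch itself uses for __getitem__ (the built-in select op and its backward are registered in C++ through PyTorch's autograd code generation); it only shows how a forward/backward pair gets registered and how the output records its grad_fn:

```python
import torch

class MySelect(torch.autograd.Function):
    """Toy re-implementation of integer indexing along dim 0."""

    @staticmethod
    def forward(ctx, input, index):
        # Save what backward will need: the input shape and the index.
        ctx.input_shape = input.shape
        ctx.index = index
        # Clone to avoid returning a view of the input.
        return input[index].clone()

    @staticmethod
    def backward(ctx, grad_output):
        # The gradient of indexing is a scatter: zeros everywhere
        # except at the selected position.
        grad_input = grad_output.new_zeros(ctx.input_shape)
        grad_input[ctx.index] = grad_output
        return grad_input, None  # the index itself gets no gradient

a = torch.randn(2, 3, requires_grad=True)
b = MySelect.apply(a, 0)
print(b.grad_fn)  # <torch.autograd.function.MySelectBackward object at 0x...>
b.sum().backward()
print(a.grad)     # ones in row 0, zeros in row 1
```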

If you want to know more, you need to dive deep into the code on GitHub: in the pytorch/pytorch repo, the backward formulas for built-in ops like select are declared in tools/autograd/derivatives.yaml, and the SelectBackward node classes are generated from those entries by the autograd codegen.
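Even before reading the C++ sources, you can inspect the recorded graph from Python. Walking grad_fn.next_functions shows the chain of Select nodes that the indexing built, ending at the AccumulateGrad node for the leaf tensor:

```python
import torch

a = torch.randn(2, 3, 4, requires_grad=True)
b = a[0][1][1]  # three integer indexes -> three select ops

# Each grad_fn node links back to the node that produced its input.
node = b.grad_fn
while node is not None:
    print(type(node).__name__)
    node = node.next_functions[0][0] if node.next_functions else None

# Prints (on a recent PyTorch):
# SelectBackward0
# SelectBackward0
# SelectBackward0
# AccumulateGrad
```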