Is [] indexing in PyTorch integrated with autograd?

Hello. I was reading the PyTorch implementation of FCN from https://github.com/wkentaro/pytorch-fcn. In the model definition file I found these lines:
```python
def forward(self, x):
    h = x
    h = self.relu1_1(self.conv1_1(h))
    h = self.relu1_2(self.conv1_2(h))
    h = self.pool1(h)

    h = self.relu2_1(self.conv2_1(h))
    h = self.relu2_2(self.conv2_2(h))
    h = self.pool2(h)

    h = self.relu3_1(self.conv3_1(h))
    h = self.relu3_2(self.conv3_2(h))
    h = self.relu3_3(self.conv3_3(h))
    h = self.pool3(h)

    h = self.relu4_1(self.conv4_1(h))
    h = self.relu4_2(self.conv4_2(h))
    h = self.relu4_3(self.conv4_3(h))
    h = self.pool4(h)

    h = self.relu5_1(self.conv5_1(h))
    h = self.relu5_2(self.conv5_2(h))
    h = self.relu5_3(self.conv5_3(h))
    h = self.pool5(h)

    h = self.relu6(self.fc6(h))
    h = self.drop6(h)

    h = self.relu7(self.fc7(h))
    h = self.drop7(h)

    h = self.score_fr(h)

    h = self.upscore(h)
    h = h[:, :, 19:19 + x.size()[2], 19:19 + x.size()[3]].contiguous()

    return h

```
In the second-to-last line,
h = h[:, :, 19:19 + x.size()[2], 19:19 + x.size()[3]].contiguous()

can the [] indexing be handled by autograd? And will contiguous() affect the autograd mechanism?
Thanks!

Yes, indexing works with autograd: gradients flow back to the sliced elements, while the cropped-away positions simply receive zero gradient. The contiguous() call is also tracked by autograd (it only changes the memory layout), so it causes no problems either.
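As a quick check, here is a minimal sketch (with a made-up tensor shape, not taken from the FCN model) showing that gradients flow only into the sliced region:

```python
import torch

# Toy tensor standing in for an activation map (shape chosen arbitrarily).
x = torch.randn(1, 3, 8, 8, requires_grad=True)

# Slice a crop and make it contiguous, mirroring the cropping step above.
h = x[:, :, 2:6, 2:6].contiguous()

h.sum().backward()

# The gradient is 1 inside the 4x4 crop and 0 everywhere else,
# showing that both the indexing and contiguous() are differentiable.
print(x.grad[0, 0])
```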