Backprop gradient flow in a ResNet model

In the provided implementation of ResNet, how is the order of layers (blocks) for gradient flow during backprop determined? Is it the exact reverse of the order defined in `forward`?
I am asking because I want to load pre-trained network weights and then skip the last block during both the forward and backward passes.
To do so, I ended up writing an if/else statement in the `forward` of the ResNet, but now I am not sure what is happening during the backward pass.
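For context, here is a minimal sketch of what I mean (the model and the `skip_last_block` flag are simplified stand-ins, not the actual ResNet code). Since autograd records the operations executed in `forward` and backpropagates through them in reverse, my understanding is that a block skipped in `forward` should receive no gradient in `backward`:

```python
import torch
import torch.nn as nn

class TinyResNet(nn.Module):
    """Simplified stand-in for a ResNet; block names are hypothetical."""
    def __init__(self, skip_last_block=False):
        super().__init__()
        self.skip_last_block = skip_last_block  # hypothetical flag
        self.block1 = nn.Linear(8, 8)
        self.block2 = nn.Linear(8, 8)  # the "last block" we may skip
        self.head = nn.Linear(8, 2)

    def forward(self, x):
        x = torch.relu(self.block1(x))
        if not self.skip_last_block:  # the if/else mentioned above
            x = torch.relu(self.block2(x))
        return self.head(x)

model = TinyResNet(skip_last_block=True)
out = model(torch.randn(4, 8))
out.sum().backward()

# block2 never ran in forward, so it gets no gradient in backward
print(model.block1.weight.grad is not None)  # True
print(model.block2.weight.grad is None)      # True
```

Is this reasoning correct, i.e. does skipping the block in `forward` automatically exclude it from the backward pass as well?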