Reusing variables with the same name

I am trying to build a model that repeatedly applies two different layers; below is my implementation.

import torch.nn as nn

class mymodel(nn.Module):
    def __init__(self, k_iters):
        super().__init__()
        self.k_iters = k_iters
        self.layer1 = ...  # layer definitions omitted
        self.layer2 = ...

    def forward(self, x):
        x_k = x
        for k in range(self.k_iters):
            z_k = self.layer1(x_k)
            x_k = self.layer2(z_k)
        return x_k

Would there be any problem when I reuse the variables “z_k” and “x_k” so many times? I wonder whether it interferes with the backward pass or gradient tracking…

No, reusing the variable names causes no issues. Autograd tracks the tensors in the computation graph, not the Python names bound to them, so rebinding x_k on each iteration still leaves every intermediate tensor reachable from the graph. For future reference, if PyTorch cannot do something, it will throw an error and tell you to explicitly clone the tensor.
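Here is a minimal sketch you can run to check this yourself; the nn.Linear layers and their sizes are hypothetical placeholders, not from your original post:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self, k_iters):
        super().__init__()
        self.k_iters = k_iters
        # hypothetical layers, chosen only so the example runs
        self.layer1 = nn.Linear(8, 8)
        self.layer2 = nn.Linear(8, 8)

    def forward(self, x):
        x_k = x
        for _ in range(self.k_iters):
            z_k = self.layer1(x_k)  # each call creates a new tensor; the graph keeps it alive
            x_k = self.layer2(z_k)  # rebinding the name does not break the graph
        return x_k

model = MyModel(k_iters=3)
out = model(torch.randn(4, 8))
out.sum().backward()
print(model.layer1.weight.grad is not None)  # True: gradients flowed through every iteration

Note that because the same two layers are applied in every iteration, their weights accumulate gradient contributions from all k_iters applications, which is exactly the weight-tied behavior your loop implies.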
