[DOUBT] Copying of Tensors

I have an AutoEncoder model in which I return both the encoding and the reconstructed output. This is the forward method that I use:

def forward(self, x):
    for module in self.encoder:
        x = module(x)

    encoding = x * 1  # added the multiply just for testing

    for module in self.decoder:
        x = module(x)

    return x, encoding

On the encoding = x * 1 line, is the simple assignment I am doing the wrong way to copy a tensor?
When I check the outputs, they refer to different grad_fn attributes. Am I missing something here?
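For reference, here is a minimal reproduction of what I am seeing (torch.relu stands in for my encoder here, and the exact grad_fn names may vary by PyTorch version):

import torch

x = torch.randn(4, requires_grad=True)
h = torch.relu(x)         # stand-in for the last encoder module's output
enc_mul = h * 1           # multiplication records a MulBackward0 node
enc_clone = h.clone()     # clone records a CloneBackward0 node instead
print(h.grad_fn)          # <ReluBackward0 object at ...>
print(enc_mul.grad_fn)    # <MulBackward0 object at ...>
print(enc_clone.grad_fn)  # <CloneBackward0 object at ...>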

I’m not sure what your use case is, but encoding will hold the result of the multiplication with the “old” x tensor, since the name x is rebound to a new tensor in the following loop.
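A minimal sketch of that rebinding behaviour with plain tensors (no model involved):

import torch

x = torch.ones(2, requires_grad=True)
encoding = x * 1   # new tensor holding the "old" value of x
x = x + 2          # rebinds the name x to a new tensor
print(encoding)    # tensor([1., 1.], grad_fn=<MulBackward0>) -- unaffected
print(x)           # tensor([3., 3.], grad_fn=<AddBackward0>)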


Thank you for the reply. I was thinking I would have to use .clone() to copy the tensor.

I am planning to use the encoding as well as x for optimisation. Will that work fine in this case?
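Roughly what I have in mind is the sketch below. The toy architecture, layer sizes, and the 0.01 penalty weight are just placeholders; the point is that both returned tensors live in the same autograd graph, so one combined loss backpropagates through both:

import torch
import torch.nn as nn

class TinyAutoEncoder(nn.Module):
    # illustrative stand-in, not the original model
    def __init__(self):
        super().__init__()
        self.encoder = nn.ModuleList([nn.Linear(784, 32), nn.ReLU()])
        self.decoder = nn.ModuleList([nn.Linear(32, 784)])

    def forward(self, x):
        for module in self.encoder:
            x = module(x)
        encoding = x  # plain assignment is enough; x is rebound below
        for module in self.decoder:
            x = module(x)
        return x, encoding

model = TinyAutoEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

inputs = torch.randn(8, 784)
reconstruction, encoding = model(inputs)

# Reconstruction loss plus an (assumed) penalty on the encoding;
# gradients flow through both return values of forward().
loss = criterion(reconstruction, inputs) + 0.01 * encoding.pow(2).mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()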