Will .clone() in def forward damage the autograd graph?

Hi guys, my whole network follows an encoder-decoder architecture, which looks like this:

import torch
import torch.nn as nn
from torchvision import models, transforms

class vgg(nn.Module):  # encoder
    def __init__(self):
        super().__init__()
        # the encoder is a pre-trained vgg16 feature extractor
        self.encoder = models.vgg16(pretrained=True).features
        # I use pre-trained vgg16 weights, so ImageNet normalization is needed
        self.normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                              std=[0.229, 0.224, 0.225])

    def forward(self, in_img):
        x = in_img.clone()  # I need an unchanged in_img
        x = self.normalize(x.squeeze())
        x = x.unsqueeze(0)
        out = self.encoder(x)
        return out, in_img

class whole_net(nn.Module):  # encoder + decoder
    def __init__(self):
        super().__init__()
        self.vgg = vgg()
        self.decoder = decoder()  # whatever decoder module you use

    def forward(self, input):
        feat, in_img = self.vgg(input)  # "in" is a reserved keyword, so renamed to feat
        out = self.decoder(feat)
        final_output = out + in_img  # skip connection with the unchanged input
        return final_output
Will the .clone() cause a problem? And if I want to check my net's autograd graph, what should I do?

Hi,

You can use .clone() in your forward method without any problem: clone() is differentiable, so it is recorded in the autograd graph and gradients flow through it back to the original tensor.
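A minimal sketch to convince yourself that gradients reach the original tensor through a clone (which is why the skip connection out + in_img still trains the input path):

import torch

x = torch.ones(3, requires_grad=True)
y = x.clone()          # clone is recorded in the graph (CloneBackward node)
z = (y * 2).sum()
z.backward()
print(x.grad)          # tensor([2., 2., 2.]) -- gradients reached the original tensor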

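For the second question, one simple way to check the autograd graph is to walk it yourself: every non-leaf tensor has a grad_fn, and each grad_fn's next_functions point back toward the leaves. A minimal sketch (node class names like CloneBackward0 vary slightly across PyTorch versions; the torchviz package's make_dot can render the same graph as an image if you prefer):

import torch

x = torch.ones(3, requires_grad=True)
z = (x.clone() * 2).sum()

def walk(fn, depth=0):
    # Recursively print the backward graph from the output toward the leaves.
    if fn is None:
        return
    print("  " * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        walk(next_fn, depth + 1)

walk(z.grad_fn)
# SumBackward0
#   MulBackward0
#     CloneBackward0
#       AccumulateGrad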