I have a forward method like the following code (simplified):
def forward(self, Y, D):
    L0_1 = self.relu(self.low_level(D))
    cas_input = self.relu(self.conv_genesis(L0_1))
    for i in range(self.N + 1):
        L_1 = self.relu(self.D_convs[i](cas_input))    # leak1
        L_2 = torch.cat([L_1, Y_features[i]], 1)       # leak2
        cas_input = self.relu(self.D_fusions[i](L_2))  # leak3
    output = self.reconstruct(cas_input) + D
    return output
I checked with nvidia-smi and found what looks like a memory leak: GPU memory usage grows on every iteration of the for loop.
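To show how I measured it: nvidia-smi reports the whole pool held by PyTorch's caching allocator, so I also tried printing `torch.cuda.memory_allocated`, which counts only live tensors. This is just a minimal sketch with a stand-in loop body (my real model's modules are omitted); it falls back silently on a CPU-only machine:

```python
import torch

def report(tag):
    # memory_allocated counts bytes of currently live tensors;
    # nvidia-smi also includes the caching allocator's reserved pool,
    # which grows but is not necessarily a leak.
    if torch.cuda.is_available():
        print(f"{tag}: {torch.cuda.memory_allocated() / 1e6:.1f} MB allocated")

x = torch.randn(256, 256)
if torch.cuda.is_available():
    x = x.cuda()
for i in range(3):
    x = torch.relu(x)  # stand-in for one cascade iteration of my loop
    report(f"iter {i}")
```

If `memory_allocated` grows each iteration, live tensors are being retained; if only nvidia-smi grows, it may just be the allocator caching blocks.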