Old problem but strange behavior: Trying to backward through the graph a second time

It's an old problem, but I didn't use the intermediate results or calculate anything with them.
The code is below, and I unexpectedly found that when I remove the lines

            for p in llist:
                print(p)

the code works!
But with the print(p) I encounter this problem.
I wonder what happened?

    for epoch in range(epochs):
        target_model.train()
        for idx, (data, label) in enumerate(self.holded_dataloader):
            optimizer.zero_grad()

            z = self.GM_model.G.gen_noise(label.size()[0]).cuda()
            trigger = self.GM_model.G(z)
            data, label = data.cuda(), label.cuda()

            poisoned_data = data.detach().clone()
            poisoned_data = self.GM_model.add_trigger(poisoned_data, self.GM_model.transform(
                trigger.view(-1, (1 if self.args.dataset == 'mnist' else 3), self.args.ylen, self.args.xlen),
                self.GM_model.dataset_stats(self.args.dataset)), self.args)

            ploss = lossfunc(target_model(poisoned_data), label.detach().clone().fill_(self.target_class))
            bloss = self.unlearning_step(target_model, self.teacher, poisoned_data.clone(), label.clone(),
                                         optimizer, self.args.device, self.KL_temperature)

            loss = beta * bloss - p * ploss

            llist.append(loss)

            loss.backward()
            optimizer.step()

        # print('loss is ', end='')
        # for p in llist:
        #     print(p)
        # print()

The error report is:

    Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
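For reference, here is a minimal standalone snippet (just an illustration, not the training code above) that produces the same message by calling .backward() twice on one graph:

    import torch

    x = torch.ones(3, requires_grad=True)
    y = (x * x).sum()   # mul saves its inputs for the backward pass

    y.backward()        # frees the saved tensors of the graph
    y.backward()        # raises: Trying to backward through the graph a second time
    # passing retain_graph=True to the first backward() would avoid this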

This reminds me of this issue which raised a similar error.
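For what it's worth, one pattern that produces this exact message is reusing a loss tensor from an earlier iteration inside a new loss. A minimal sketch (purely illustrative, not necessarily what happens in your code):

    import torch

    w = torch.randn(2, requires_grad=True)
    stale = None
    for step in range(2):
        loss = (w * w).sum()
        if stale is not None:
            loss = loss + stale   # stale's graph was already freed by the previous backward()
        loss.backward()           # raises the error above on the second iteration
        stale = loss              # keeping the undetached loss keeps its (already freed) graph reachable

Storing loss.detach() or loss.item() instead would break that link.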

Could you add the missing code parts to make the snippet executable, please?

Thanks a lot for the reply!
The code I posted has since been modified, so I can't reproduce it either.
I was not available these days, so I will check it out and try to reproduce it later.
Thanks anyway for your reply!