Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling backward the first time
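This error can be reproduced with a minimal sketch (independent of your model): calling `backward()` a second time on the same graph fails because the saved intermediate tensors are freed after the first call, while `retain_graph=True` keeps them alive:

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x ** 2).sum()

y.backward()  # first backward pass works; the graph's buffers are then freed

try:
    y.backward()  # second pass fails with the RuntimeError quoted above
except RuntimeError as e:
    print("RuntimeError:", e)

# Retaining the graph on the first call allows a second backward pass:
z = (x ** 2).sum()
z.backward(retain_graph=True)
z.backward()  # succeeds
```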

Thanks for the update.
Your losses do not depend on any of the model's parameters, since they are calculated from the inputs alone:

    set_inputs = [input_dt_1, input_dt_2, input_dt_3]

    self.optimizer.zero_grad()
    outputs = self.net(set_inputs)

    # l1_loss minimizes the reconstruction error when the other inputs are missing
    loss = nn.MSELoss()
    l1_loss, set_hidden_rep = self.calculate_l1_loss(set_inputs, loss)

    # l2_loss calculates the correlation between hidden representations to encourage
    # the hidden units to be shared between the representations
    l2_loss = self.calculate_l2_loss(set_hidden_rep, lambda_val=0.02)

As you can see, l1_loss is calculated from set_inputs and l2_loss from the hidden representations returned by calculate_l1_loss; outputs is never used in either loss, so no gradients will reach the model's parameters.
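The effect can be illustrated with a minimal sketch (using a plain `nn.Linear` as a stand-in for your model, since I don't have the full definition): a loss built only from the inputs carries no graph back to the parameters, whereas a loss built from the network's output does:

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 4)
inp = torch.randn(2, 4)
criterion = nn.MSELoss()

# Loss built only from the inputs: no autograd graph reaches net's parameters,
# so this tensor doesn't even require grad and backward() would fail on it.
loss_inputs_only = criterion(inp, torch.zeros_like(inp))
print(loss_inputs_only.requires_grad)  # False

# Loss built from the network output: gradients flow into net's parameters.
out = net(inp)
loss_with_params = criterion(out, inp)
loss_with_params.backward()
print(net.weight.grad is not None)  # True
```

If `calculate_l1_loss` is meant to train the model, it should operate on (or internally produce) tensors derived from `self.net`, not on the raw inputs.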