AttributeError: 'module' object has no attribute 'define_optimizers'

I got the following error while implementing a model based on FineGAN, and I haven't been able to solve it despite a lot of trying.
Does anyone have a suggestion?

Ubuntu 18.04
Python 2.7.17
conda 4.8.2
PyTorch 0.4.1

    def train(self):
        self.netG, self.netsD, self.num_Ds, start_count = load_network(self.gpus)
        avg_param_G = copy_G_params(self.netG)
        self.optimizerG,self.optimizersD = define_optimizers(self.netG,self.netsD)

        self.criterion = nn.BCELoss(reduce=False)
        self.criterion_one = nn.BCELoss()
        self.criterion_class = nn.CrossEntropyLoss()

        self.real_labels = \
            Variable(torch.FloatTensor(self.batch_size).fill_(1))
        self.fake_labels = \
            Variable(torch.FloatTensor(self.batch_size).fill_(0))

        nz = cfg.GAN.Z_DIM
        noise = Variable(torch.FloatTensor(self.batch_size, nz))
        fixed_noise = \
            Variable(torch.FloatTensor(self.batch_size, nz).normal_(0, 1))
        hard_noise = \
            Variable(torch.FloatTensor(self.batch_size, nz).normal_(0, 1)).cuda()

        self.patch_stride = float(4)    # Receptive field stride given the current discriminator architecture for background stage
        self.n_out = 24                 # Output size of the discriminator at the background stage; N X N where N = 24
        self.recp_field = 34            # Receptive field of each of the member of N X N

    Traceback (most recent call last):
      File "main.py", line 106, in <module>
        algo.train()
      File "/home/hogehoge/finegan-master/code/trainer.py", line 360, in train
        self.optimizerG,self.optimizersD = define_optimizers(self.netG,self.netsD)
      File "/home/hogehoge/finegan-master/code/trainer.py", line 109, in define_optimizers
        betas=(0.5, 0.999))
      File "/home/hogehoge/anaconda3/envs/finegan-master/lib/python2.7/site-packages/torch/optim/adam.py", line 41, in __init__
        super(Adam, self).__init__(params, defaults)
      File "/home/hogehoge/anaconda3/envs/finegan-master/lib/python2.7/site-packages/torch/optim/optimizer.py", line 38, in __init__
        raise ValueError("optimizer got an empty parameter list")
    ValueError: optimizer got an empty parameter list
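For reference, the same ValueError can be reproduced in isolation by handing any optimizer an empty parameter iterable, so it seems the network parameters never reach define_optimizers (a minimal sketch, not FineGAN code; the lr value is a placeholder):

    import torch.optim as optim

    # Any optimizer raises this error when the parameter iterable is empty.
    optim.Adam([], lr=0.0002, betas=(0.5, 0.999))
    # ValueError: optimizer got an empty parameter list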

Thank you.

Check whether you are returning the models correctly (for example from load_network); "optimizer got an empty parameter list" means the network you pass to the optimizer has no registered parameters.
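A minimal sketch of such a check, assuming a define_optimizers shaped roughly like the one in your trainer.py (the lr values here are placeholders; in the real code they would come from cfg):

    import torch.optim as optim

    def define_optimizers(netG, netsD):
        # Materialize the parameter lists so an empty model fails with a clear
        # message instead of the generic "optimizer got an empty parameter list".
        paramsG = list(netG.parameters())
        assert len(paramsG) > 0, "netG has no parameters -- check what load_network returns"
        optimizerG = optim.Adam(paramsG, lr=0.0002, betas=(0.5, 0.999))

        optimizersD = []
        for i, netD in enumerate(netsD):
            paramsD = list(netD.parameters())
            assert len(paramsD) > 0, "netsD[%d] has no parameters -- check what load_network returns" % i
            optimizersD.append(optim.Adam(paramsD, lr=0.0002, betas=(0.5, 0.999)))
        return optimizerG, optimizersD

If one of these assertions fires, the problem is upstream in load_network (e.g. it returns None, a list of state dicts, or the class itself instead of constructed nn.Module instances).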

You were right; there were other mistakes in the code. We are fixing them now. Thank you so much.