Custom optimizer got an empty parameter list

Hello, new here. I am trying to create a custom optimizer in PyTorch, where the backpropagation takes place inside a meta-RL policy. However, I am getting the error above: my models train fine with Adam and SGD, but not with my optimizer.


class MetaBackProp(torch.optim.Optimizer):
	def __init__(self, params):
		self.param_shape_list = np.array([])
		for param in list(params):
			np.append(self.param_shape_list, list(param.size()))

		pseudo_lr = 1e-4
		pseudo_defaults = dict(lr=pseudo_lr)
		length = 100 #TODO: get shape, flatten, multiply...
		self.policy = AEPolicy(length)
		self.policy_optim = torch.optim.Adam(self.policy.parameters(), lr=pseudo_lr)
		super(MetaBackProp, self).__init__(params, pseudo_defaults)

	def step(self, closure=None):
		params = [p.view(-1) for p in self.param_groups]


Traceback (most recent call last):
  File "", line 6, in <module>
    gan = CycleGAN()
  File "/home/ai/Projects_v2/R/", line 32, in __init__
    self.discriminator2_optim = MetaBackProp(self.discriminator2.parameters())
  File "/home/ai/Projects_v2/R/", line 34, in __init__
    super(MetaBackProp, self).__init__(params, pseudo_defaults)
  File "/home/ai/anaconda3/lib/python3.7/site-packages/torch/optim/", line 46, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list

Once you call `list(params)`, you exhaust the generator returned by `parameters()`. By the time `super().__init__` runs, there is nothing left to iterate over, hence the empty parameter list. Materialize the generator into a list once and reuse that list everywhere.
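A minimal sketch of the fix (omitting the `AEPolicy` parts, which are specific to your project): convert `params` to a list once, and use that list both for the shape bookkeeping and for the `super().__init__` call. Note also that `np.append` returns a new array rather than modifying in place, and that `self.param_groups` is a list of dicts whose tensors live under the `'params'` key:

```python
import torch


class MetaBackProp(torch.optim.Optimizer):
    def __init__(self, params):
        # parameters() returns a generator; materialize it ONCE so both
        # the shape bookkeeping and super().__init__ see the parameters
        params = list(params)

        # collect shapes from the list (np.append would return a new
        # array instead of modifying self.param_shape_list in place)
        self.param_shape_list = [list(p.size()) for p in params]

        pseudo_lr = 1e-4
        pseudo_defaults = dict(lr=pseudo_lr)
        super().__init__(params, pseudo_defaults)

    def step(self, closure=None):
        # param_groups is a list of dicts; tensors are under 'params'
        flat = [p.view(-1)
                for group in self.param_groups
                for p in group['params']]
        return flat


# usage: a Linear layer has two parameters (weight and bias)
model = torch.nn.Linear(3, 2)
opt = MetaBackProp(model.parameters())
print(opt.param_shape_list)
```

With the list materialized up front, the generator is consumed only once, so the base-class `__init__` receives the full parameter list instead of an empty iterator.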