Problems implementing a decision tree

Hello everyone, I’m a new PyTorch user and I’m currently trying to implement a soft version of a decision tree in PyTorch. One problem I’ve run into is that the set of model parameters changes as training goes on (as the tree grows, new tree nodes introduce new parameters). Is it still possible to use the optimizers in torch.optim? From what I’ve seen, the built-in optimizers expect all parameters to be collected up front and passed to the optimizer at construction time. How can I use an optimizer when the number of parameters in my model changes dynamically as training proceeds? Thanks!

http://pytorch.org/docs/0.3.0/optim.html#torch.optim.Optimizer.add_param_group
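The `add_param_group` method linked above lets you register new parameters with an existing optimizer after it has been constructed, so you don't need to know the full parameter set up front. A minimal sketch (the tensor names and shapes here are just placeholders for whatever your tree nodes actually hold):

```python
import torch

# Start the optimizer with only the root node's parameters.
root_weight = torch.randn(4, 4, requires_grad=True)
optimizer = torch.optim.SGD([root_weight], lr=0.1)

# Later, when the tree grows, register the new node's parameters
# with the existing optimizer instead of rebuilding it.
new_node_weight = torch.randn(4, 4, requires_grad=True)  # hypothetical new split node
optimizer.add_param_group({"params": [new_node_weight]})

# From here on, optimizer.step() updates both the old and the new parameters.
loss = (root_weight.sum() + new_node_weight.sum()) ** 2
loss.backward()
optimizer.step()
```

Each call to `add_param_group` appends a new entry to `optimizer.param_groups`, and you can give the new group its own hyperparameters (e.g. a different `lr`) in the same dict if freshly grown nodes should be trained differently from established ones.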