How to update a PyTorch variable and module parameters at the same time?

This is my network structure:


import torch
import torch.nn as nn


class Generator(nn.Module):
    def __init__(self, latent_size=128): # 128 << 784 = 28*28
        super().__init__()
        self.latent_size = latent_size
        self.inputlayer = nn.Sequential(
            nn.Linear(latent_size, 25),
            nn.Tanh(),
        )
        self.hidden1 = nn.Sequential(
            nn.Linear(25, 100),
            nn.Tanh(),
        )
        self.hidden2 = nn.Sequential(
            nn.Linear(100, 200),
            nn.Tanh(),
        )
        self.deconv_out = nn.Linear(200, 784) # 784 = 28 * 28 for mnist

    def forward(self, x):
        # self.transform is supplied by subclasses such as DataGenerator below
        net = self.inputlayer(x)
        net = self.hidden1(net)
        net = self.hidden2(net)
        net = self.deconv_out(net)
        return self.transform(net).view(-1, 1, 28, 28)


class DataGenerator(Generator):
    def __init__(self, latent_size=128):
        super().__init__(latent_size=latent_size)
        self.transform = torch.sigmoid

I am trying to pass in a latent variable Z drawn from a uniform distribution, and to update both Z and the network parameters during training.

import torch.optim as optim

nz = 128   # dimensionality of the latent code
beta = 0.1

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
data_gen = DataGenerator().to(device)
batch_size = 64

# Variable is deprecated since PyTorch 0.4; a leaf tensor with
# requires_grad=True serves the same purpose.
Z = torch.empty(batch_size, nz, device=device).uniform_()
Z.requires_grad_()

lrate = 1e-4
data_gen_optimizer = optim.Adam(
    data_gen.parameters(), lr=lrate, betas=(0.5, 0.9))

When I try to pass data_gen.parameters() and Z to the optimizer together in a single list, it gives the following error:

data_gen_optimizer = optim.Adam(
    [data_gen.parameters(), Z], lr=lrate, betas=(0.5, 0.9))

optimizer can only optimize Tensors, but one of the params is Module.parameters

So can anybody give advice on how to let the optimizer update the module parameters and Z at the same time?

Hi,

data_gen.parameters() returns a generator, not a tensor, so the optimizer rejects it when it appears as an element of the parameter list. You most likely want to do something like: list(data_gen.parameters()) + [Z]
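
A minimal sketch of how this could look, reusing data_gen, Z, and lrate from the code above (criterion and target in the last lines are hypothetical stand-ins for whatever loss and data you train with):

import torch
from torch import optim

# One flat parameter list: materialize the generator and append Z.
data_gen_optimizer = optim.Adam(
    list(data_gen.parameters()) + [Z], lr=lrate, betas=(0.5, 0.9))

# Equivalent alternative with parameter groups, which would also let Z
# have its own learning rate if desired:
data_gen_optimizer = optim.Adam(
    [{'params': data_gen.parameters()},  # a generator is fine inside a group
     {'params': [Z]}],
    lr=lrate, betas=(0.5, 0.9))

# A single training step then updates the network weights and Z together:
loss = criterion(data_gen(Z), target)  # criterion/target are hypothetical
data_gen_optimizer.zero_grad()
loss.backward()
data_gen_optimizer.step()

Since Z is a leaf tensor with requires_grad=True, it receives a gradient from loss.backward() and is stepped by the optimizer just like the module weights.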
