Trainable input z-vector to generator

How do I train the input noise vector to a generator while keeping the generator weights frozen? I’ve been setting requires_grad=True on the input, freezing the model weights, and training. However, the input as I print it out does not change over the course of training (nor does the model), so I’m clearly missing something.

Do I have to pass the input as a parameter to the optimizer when I initialize it (like in neural style transfer), and if so, how do I do that with MSELoss? Or is there a simpler method?

# Get the noise vector
z = util.get_input_noise() # vector of size 100 with values [0,1)
z = z.detach().requires_grad_()

# Freeze the model
for param in model.parameters():
    param.requires_grad = False

model.eval() # Can I use model in eval mode for this or does this do something weird with backprop?

# Training loop (simplified -- assume certain vars already initialized)
for i in range(n_iters):
    probs = model(z)  # calling the module directly is equivalent to .forward() and runs hooks
    loss = loss_fn(probs, target)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

Thanks so much!

Related posts (these did not work for me):

Neural Style transfer: Neural Transfer Using PyTorch — PyTorch Tutorials 2.1.1+cu121 documentation


I’d say you have to pass the input to the optimizer. Are you doing so?
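For example (Adam and the learning rate here are just placeholder choices):

optimizer = torch.optim.Adam([z], lr=0.01)  # optimize z itself, not model.parameters()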


Yes, that’s what I was doing incorrectly - the optimizer needs to take in the input.
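For completeness, here is a minimal sketch of the whole setup with the fix applied (util.get_input_noise, model, loss_fn, target, and n_iters are assumed from the original post; Adam and lr=0.01 are illustrative choices):

import torch

# The noise vector is the only trainable leaf tensor
z = util.get_input_noise().detach().requires_grad_()

# Freeze the generator weights
for param in model.parameters():
    param.requires_grad = False

# eval() only switches layers such as dropout and batchnorm to
# inference behavior; it does not interfere with backprop to z
model.eval()

# The fix: hand the optimizer z itself instead of model.parameters()
optimizer = torch.optim.Adam([z], lr=0.01)

for i in range(n_iters):
    optimizer.zero_grad()
    probs = model(z)
    loss = loss_fn(probs, target)  # nn.MSELoss() works here unchanged
    loss.backward()
    optimizer.step()

Nothing about the loss function needs to change: MSELoss (or any other criterion) produces gradients with respect to z as soon as z is the tensor the optimizer updates.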