How to solve this optimization problem?

Hi everyone,

I am a beginner in PyTorch and I am trying to code one of the equations that is mentioned in a research paper.

This equation is solved using a trained GAN model. Here z is the latent vector of the trained GAN and x is the generated image.

In terms of coding, how can I do this? How can I obtain the z of the pretrained GAN?

Hi Iram!

The equation you posted an image of means that you are to find
the values of z[i] that minimize MSELoss (x[?], G (z)), while
holding the parameters that define G constant.

(I believe that the use of i as the index on both sides of the
equation is a notational inconsistency – that’s why I used ? as
the index for x – but please explain if you believe otherwise.)

To use pytorch to do this, you would use pytorch’s autograd and
gradient-descent-optimization machinery to minimize your “loss”
with respect to z.

To do this you would start with z as a one-dimensional tensor
of length d (shape [d]), and wrap it as a Parameter. Initialize
z somehow – perhaps to zero or perhaps randomly or perhaps
to an initial guess, if you have one. Let me assume that G is a
pytorch “model,” that is, some sort of Module. Go through all of G’s
Parameters, and for each of them set requires_grad = False.
(This is so that you won’t unnecessarily calculate gradients for the
Parameters of G.) Instantiate a pytorch optimizer with z as its
Parameter. Then run an optimization loop whose iteration is:

loss = torch.nn.functional.mse_loss (G (z), x)   # mismatch between G (z) and x
opt.zero_grad()                                  # clear gradients from the previous step
loss.backward()                                  # the gradient flows only into z
opt.step()                                       # update z

Each optimization step moves z in the direction that lowers the
mismatch between G (z) and x.
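
Putting the pieces together, here is a minimal, self-contained sketch.
(The Linear “generator,” the latent dimension d = 100, the output size,
the learning rate, and the number of steps are placeholders I made up –
substitute your pretrained G, your target x, and your own hyperparameters.)

import torch

d = 100                                     # latent dimension (placeholder)
G = torch.nn.Linear (d, 784)                # stand-in for your pretrained generator
x = torch.randn (784)                       # stand-in for your target image

for p in G.parameters():                    # freeze G – we only optimize z
    p.requires_grad = False

z = torch.nn.Parameter (torch.zeros (d))    # initial guess for z
opt = torch.optim.SGD ([z], lr = 0.1)       # z is the optimizer's only Parameter

for _ in range (1000):
    loss = torch.nn.functional.mse_loss (G (z), x)
    opt.zero_grad()
    loss.backward()
    opt.step()

After the loop, z.detach() holds the recovered latent vector.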

Best.

K. Frank


Hi Frank,

The explanation of this equation is:

We first embed x1, x2 into the latent space of G. Specifically, we optimize over the latent space to find vectors that produce inputs close to x1 and x2 in l2-norm.
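
So, if I understand your recipe, I would just run the same optimization
once per image. Here is my attempt at wrapping it in a function (G, x1,
and x2 stand for the pretrained generator and the two images – please
correct me if I got it wrong):

import torch

def invert (G, x, d = 100, steps = 1000, lr = 0.1):
    # optimize a latent vector z so that G (z) is close to x in l2-norm
    for p in G.parameters():
        p.requires_grad = False
    z = torch.nn.Parameter (torch.zeros (d))
    opt = torch.optim.SGD ([z], lr = lr)
    for _ in range (steps):
        loss = torch.nn.functional.mse_loss (G (z), x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return z.detach()

z1 = invert (G, x1)   # latent vector for x1
z2 = invert (G, x2)   # latent vector for x2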

I don’t think the use of i on both sides of the equation is a notational inconsistency. Can you explain why you think it is?

Can you please share your thoughts on this?