I’m trying to understand how optimization and backpropagation work.

In this code from Neural style transfer:

```
optimizer = optim.LBFGS([opt_img])  # only opt_img is passed to the optimizer
n_iter = [0]
while n_iter[0] <= max_iter:

    def closure():
        optimizer.zero_grad()
        out = vgg(opt_img, loss_layers)
        layer_losses = [weights[a] * loss_fns[a](A, targets[a]) for a, A in enumerate(out)]
        loss = sum(layer_losses)
        loss.backward()
        n_iter[0] += 1
        return loss

    optimizer.step(closure)
```

Are the pretrained weights of VGG optimized?

(Or does LBFGS optimize both `opt_img` and the weights of VGG?)

(Do the weights of VGG stay the same for every iteration, or do they get optimized and become different at each iteration?)
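For reference, here is a minimal sketch of one way to check this empirically: snapshot a weight before calling `optimizer.step(closure)` and compare it afterwards. A small `nn.Linear` stands in for the pretrained VGG (a hypothetical toy model, just to keep the example self-contained), and the loss is a toy stand-in for the style/content losses:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Toy stand-in for the pretrained VGG feature extractor (assumption:
# the real code freezes VGG's parameters the same way).
vgg = nn.Linear(4, 4)
for p in vgg.parameters():
    p.requires_grad = False  # freeze the "pretrained" weights

# The image is the only tensor handed to the optimizer.
opt_img = torch.randn(1, 4, requires_grad=True)
optimizer = optim.LBFGS([opt_img])

w_before = vgg.weight.detach().clone()
img_before = opt_img.detach().clone()

def closure():
    optimizer.zero_grad()
    loss = (vgg(opt_img) ** 2).sum()  # toy loss in place of the style losses
    loss.backward()
    return loss

optimizer.step(closure)

print(torch.equal(vgg.weight, w_before))          # True: VGG weights unchanged
print(torch.equal(opt_img.detach(), img_before))  # False: the image was updated
```

An optimizer only updates the parameters it was constructed with, so `optim.LBFGS([opt_img])` can never touch the VGG weights regardless of how many iterations run.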