I want to use an optimizer to optimize a part of my input. How should I do this?

For example:


It causes this error:

    raise ValueError("can't optimize a non-leaf Tensor")
ValueError: can't optimize a non-leaf Tensor
Can somebody help me?

Indexing your input is already registered as an operation by autograd, so the tensor you pass to Adam is a "non-leaf" tensor in the computation graph, and optimizers can only update leaf tensors.
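
For instance, a pattern along these lines (just my guess at what your snippet looks like) fails with exactly that error:

import torch
from torch.optim import Adam

x = torch.rand(1, 32, 32, requires_grad=True)
x_part = x[:, 27:32, 27:32]       # indexing is an autograd op, so x_part is a non-leaf tensor
optim = Adam([x_part], lr=1e-1)   # raises ValueError: can't optimize a non-leaf Tensor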

As a workaround, consider doing the indexing at a later stage instead, e.g.:

import torch
from torch.nn import Linear, Sequential, Flatten
from torch.optim import Adam


model = Sequential(
    Flatten(1, -1),
    Linear(32 * 32, 1),
)

model_input = torch.rand(1, 32, 32)
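# boolean mask selecting the part of the input that should be optimized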
optimization_mask = torch.zeros(1, 32, 32, dtype=torch.bool)
optimization_mask[:, 27:32, 27:32] = True

num_input_params = int(optimization_mask.sum().item())
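# x_optimized is created directly (a leaf tensor), so Adam accepts it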
x_optimized = torch.rand(num_input_params, requires_grad=True)
optim = Adam([x_optimized], lr=1e-1)

for _ in range(1000):
    x = model_input.clone()
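    # write the optimized values into the fixed input; gradients flow back to x_optimized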
    x[optimization_mask] = x_optimized
    loss = (model(x) - 10)**2
    optim.zero_grad()
    loss.backward()
    optim.step()

# should equal approx. 10.0
print(model(x).detach())
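
The key point is that x_optimized is a leaf tensor Adam can update, while the masked assignment inside the loop rebuilds the full input every iteration and still lets gradients flow back into x_optimized.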

Thank you! That helps me a lot!
