How do I restrict the values of a trainable tensor to a specific range, say [0, 1]?
```python
z = torch.zeros(1, 1, 20, 20)
z.requires_grad = True
optimizer = optim.Adam([z], lr)

for input in train_loader:
    z = z.clamp(0, 1)  # restrict trainable tensor values to [0, 1]
    output = model(input + z)
    loss = loss_fn(output)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```
When I tried to clamp the tensor z, it threw the runtime error below:
```
RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed.
```
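For context, a common pattern I have seen (a minimal sketch, not necessarily the right fix for my exact setup) keeps `z` as the leaf tensor the optimizer owns and clamps it in-place outside the autograd graph after each step, instead of reassigning `z = z.clamp(...)` inside the loop. The loss below is a made-up stand-in for my model/loss:

```python
import torch

# Sketch: clamp the leaf tensor in-place under no_grad so the
# assignment does not become part of the autograd graph.
z = torch.zeros(1, 1, 4, 4, requires_grad=True)
optimizer = torch.optim.Adam([z], lr=0.1)

for _ in range(5):
    # hypothetical loss standing in for model(input + z) + loss_fn
    loss = ((z - 2.0) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        z.clamp_(0.0, 1.0)  # in-place clamp; z stays a leaf tensor

print(z.min().item(), z.max().item())  # stays within [0, 1]
```

Is this in-place clamp under `torch.no_grad()` the recommended way, or is there a better approach?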