Restrict trainable tensor values to specific range

How do I restrict the values of a trainable tensor to a specific range, say [0, 1]?

import torch
import torch.optim as optim

z = torch.zeros(1, 1, 20, 20, requires_grad=True)
optimizer = optim.Adam([z], lr=lr)
for input in train_loader:
    z = z.clamp(0, 1)  # restrict trainable tensor values to a specific range
    output = model(input + z)
    loss = loss_fn(output)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

When I tried to clamp the tensor z, it threw the following runtime error:

RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed.

Any help would be appreciated. @ptrblck @albanD

When you clamp z in your forward pass, you should not reuse the variable name that was given to the optimizer. Rebinding z makes the name point at the non-leaf output of clamp, so each subsequent iteration builds on the previous iteration's graph, whose buffers were already freed by the first backward() call. Use a new name instead:

z_clamped = z.clamp(0, 1)
output = model(input + z_clamped)

should work.
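For completeness, here is a minimal end-to-end sketch of the corrected loop. The model, loss_fn, lr, and train_loader below are placeholder stand-ins (they do not come from the original post); swap in your own objects:

import torch
import torch.optim as optim

# Hypothetical stand-ins for the objects in the question.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(400, 10))
loss_fn = lambda out: out.sum()  # placeholder loss taking only the output
lr = 1e-3
train_loader = [torch.randn(1, 1, 20, 20) for _ in range(5)]

z = torch.zeros(1, 1, 20, 20, requires_grad=True)  # leaf tensor held by the optimizer
optimizer = optim.Adam([z], lr=lr)

for input in train_loader:
    z_clamped = z.clamp(0, 1)   # new name: z itself still points at the leaf
    output = model(input + z_clamped)
    loss = loss_fn(output)
    optimizer.zero_grad()
    loss.backward()             # the graph is rebuilt from z on every iteration
    optimizer.step()

Because z is never rebound, each iteration starts a fresh graph from the leaf tensor, and backward() no longer touches freed buffers.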

Note that the clamp operation gives zero gradient wherever z falls outside the range, so this may not behave well during training: as soon as an element leaves the range, it receives a zero gradient and stops updating. Using z_clamped = z.sigmoid() instead may behave better, since sigmoid never produces an exactly zero gradient.
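Here is a minimal sketch of that sigmoid alternative, reusing the same placeholder model, loss_fn, lr, and train_loader as in the sketch above (none of these names come from the original post). The raw tensor z stays unconstrained; only its sigmoid enters the forward pass:

import torch
import torch.optim as optim

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(400, 10))  # placeholder
loss_fn = lambda out: out.sum()                                            # placeholder
lr = 1e-3
train_loader = [torch.randn(1, 1, 20, 20) for _ in range(5)]               # placeholder

z = torch.zeros(1, 1, 20, 20, requires_grad=True)  # unconstrained parameter
optimizer = optim.Adam([z], lr=lr)

for input in train_loader:
    z_squashed = z.sigmoid()        # maps smoothly into (0, 1); gradient never vanishes exactly
    output = model(input + z_squashed)
    loss = loss_fn(output)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

final_z = z.sigmoid().detach()  # the constrained values after training

One caveat with this reparameterization: sigmoid(0) = 0.5, so a zero-initialized z now starts the perturbation at 0.5 rather than 0. If you want to start near 0, initialize z to a large negative value instead.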
