Getting a dependent variable updated at the same time in PyTorch

How can I achieve this in PyTorch: I want to optimize a variable x, but I have the constraint x + y = 1.0.
When I optimize x, I want y to get updated at the same time.
For example,

import torch
from torch.optim import SGD

if __name__ == '__main__':
    x = torch.tensor(1.0, requires_grad=True)
    optimizer = SGD([x], lr=0.1)
    y = 1 - x
    z = y * 0.5 + x
    print('Beginning')
    print(x)
    print(y)
    print('---')
    optimizer.zero_grad()
    z.backward()
    optimizer.step()
    print('After')
    print(y)
    print(x)

I want y to get updated after the optimizer updates x.

Just run y = 1 - x again after optimizer.step().
There is no way to hard-constrain two trainable variables in PyTorch.
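
A minimal sketch of that approach, reusing the variable names from the question (a single optimization step, as in the original example):

import torch
from torch.optim import SGD

x = torch.tensor(1.0, requires_grad=True)
optimizer = SGD([x], lr=0.1)

y = 1 - x                    # y is derived from x, not trained directly
z = y * 0.5 + x

optimizer.zero_grad()
z.backward()
optimizer.step()             # updates x only

y = 1 - x                    # recompute y so that x + y = 1 still holds
print(x.item(), y.item())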

You can add a soft constraint if you wish.

loss = z + some_coefficient * torch.abs(y + x - 1)  # penalty for violating x + y = 1
loss.backward()

In this case, y must also be initialized as a tensor with requires_grad=True and passed to the optimizer, so that it receives its own gradient updates.
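
A sketch of the soft-constraint version with both variables passed to the optimizer; the penalty weight some_coefficient is an arbitrary choice here, and only a single step is shown to mirror the question:

import torch
from torch.optim import SGD

x = torch.tensor(1.0, requires_grad=True)
y = torch.tensor(0.0, requires_grad=True)    # y is now a trainable variable as well
optimizer = SGD([x, y], lr=0.1)

some_coefficient = 10.0                      # penalty weight, arbitrary value

optimizer.zero_grad()
z = y * 0.5 + x
loss = z + some_coefficient * torch.abs(y + x - 1)  # penalize violations of x + y = 1
loss.backward()
optimizer.step()                             # both x and y get gradient updates

print(x.item(), y.item())

Keep in mind that a soft constraint only encourages x + y = 1; the sum will generally stay near 1 rather than equal it exactly.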