Hello,

I have a function f which takes two variables x and y. I would like to perform the updates:
x = x - alpha * \frac{\partial f}{\partial x} for x, and
y = y - alpha * \frac{\partial^2 f}{\partial x \partial y} for y.

I have two optimizers that minimize x and y respectively. The gradient step for x is simple, but I am not sure how to construct the gradient step for y. Any pointers would be very helpful! Thank you.

Do you mean something like this?

import torch

x = torch.tensor(1.0, requires_grad=True)
y = torch.tensor(3.0, requires_grad=True)

f = (x * y)**2

# create_graph=True so the first derivative can itself be differentiated
grads_x = torch.autograd.grad(f, x, create_graph=True)[0]
grads_x  # 2*x * y**2 = 2*1 * 3**2 = 18

grads_x_y = torch.autograd.grad(grads_x, y)[0]
grads_x_y  # 2*x * 2*y = 2*1 * 2*3 = 12


or

f = x**2 + y**2

grads_x = torch.autograd.grad(f, x, create_graph=True)[0]

# grads_x = 2*x does not depend on y, so the mixed derivative is None;
# allow_unused=True avoids an error in that case
grads_x_y = torch.autograd.grad(grads_x, y, allow_unused=True)
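Putting the pieces together for the two-optimizer setup from the original question, here is a minimal sketch of a full update loop. The learning rate, initial values, choice of SGD, and the trick of assigning the computed derivatives to `.grad` by hand are all assumptions for illustration:

```python
import torch

x = torch.tensor(1.0, requires_grad=True)
y = torch.tensor(3.0, requires_grad=True)

# one optimizer per variable, as described in the question
opt_x = torch.optim.SGD([x], lr=0.01)
opt_y = torch.optim.SGD([y], lr=0.01)

for _ in range(5):
    opt_x.zero_grad()
    opt_y.zero_grad()

    f = (x * y) ** 2

    # first derivative wrt x; keep the graph so it can be differentiated again
    df_dx = torch.autograd.grad(f, x, create_graph=True)[0]

    # mixed second derivative d^2 f / (dx dy) drives the y update
    d2f_dxdy = torch.autograd.grad(df_dx, y)[0]

    # hand the computed derivatives to the optimizers via .grad
    x.grad = df_dx.detach()
    y.grad = d2f_dxdy.detach()

    opt_x.step()
    opt_y.step()
```

With both gradients positive at the start, each step decreases x and y from their initial values.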