I have a function f of two variables, x and y, that I would like to minimize using the updates:
x = x - \alpha \frac{\partial f}{\partial x} for x and
y = y - \alpha \frac{\partial^2 f}{\partial x \partial y} for y.

I have two optimizers that minimize over x and y, respectively. The gradient step for x is straightforward, but I am not sure how to construct the gradient step for y, since it involves the mixed second partial \frac{\partial^2 f}{\partial x \partial y}. Any pointers would be very helpful! Thank you.
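To make the question concrete, here is a minimal sketch of the setup. The toy objective f, the hand-derived partial, and the finite-difference helper are all my own choices for illustration, not my real problem; the finite-difference estimate of the mixed partial is just one way I could imagine forming the y step when an analytic form is unavailable.

```python
# Toy objective (chosen so the derivatives are known in closed form):
# f(x, y) = x^2 * y + y^2
def f(x, y):
    return x * x * y + y * y

def df_dx(x, y):
    # Hand-derived first partial used in the x step: df/dx = 2xy
    return 2.0 * x * y

def d2f_dxdy(x, y, h=1e-4):
    # Central-difference estimate of the mixed partial d^2 f / (dx dy).
    # For this toy f the exact value is 2x, which lets us sanity-check it.
    return (f(x + h, y + h) - f(x + h, y - h)
            - f(x - h, y + h) + f(x - h, y - h)) / (4.0 * h * h)

alpha = 0.1
x, y = 1.5, 0.7

# One step of each of the two updates from the question:
x_new = x - alpha * df_dx(x, y)      # x <- x - alpha * df/dx
y_new = y - alpha * d2f_dxdy(x, y)   # y <- y - alpha * d^2f/(dx dy)
```

For this f the estimate d2f_dxdy(1.5, 0.7) should come out very close to the exact value 2 * 1.5 = 3.0, so a single step gives x_new = 1.29 and y_new ≈ 0.4.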