Take second gradient

Hello,

I have a function f which takes two variables x and y. I would like to minimize f with the update rules

x \leftarrow x - \alpha \frac{\partial f}{\partial x} for x, and
y \leftarrow y - \alpha \frac{\partial^2 f}{\partial x \partial y} for y.

I have two optimizers that minimize over x and y respectively. The gradient step for x is simple, but I am not sure how to construct the gradient step for y. Any pointers would be very helpful! Thank you.

Do you mean something like this?

import torch

x = torch.tensor([1.], requires_grad=True)
y = torch.tensor([3.], requires_grad=True)

f = (x*y)**2

grads_x = torch.autograd.grad(f, x, create_graph=True)  # create_graph=True keeps the graph so grads_x can be differentiated again

grads_x # 2*x * y**2 = 2*1 * 3**2 = 18

(tensor([18.], grad_fn=<MulBackward0>),)

grads_x_y = torch.autograd.grad(grads_x, y)  # differentiate df/dx w.r.t. y to get the mixed second derivative
grads_x_y # 2*x * 2*y = 2*1 * 2*3 = 12

(tensor([12.]),)

or

f = x**2 + y**2
grads_x = torch.autograd.grad(f, x, create_graph=True)
grads_x # 2*x = 2*1 = 2

(tensor([2.], grad_fn=<MulBackward0>),)

grads_x_y = torch.autograd.grad(grads_x, y, allow_unused=True)
grads_x_y # df/dx = 2*x does not involve y, so there is no mixed derivative

(None,)
