Is there any official way to do Newton's method in pytorch?

I wanted to quickly test Newton’s method (and second order methods). Is there any easy way to do this in pytorch?

I don’t know of an official method, but this should be a good start:

import torch
from torch.autograd import Variable

def my_function(x):
    return (x**3 + 0.5)

def newton(func, guess, threshold=1e-7):
    guess = Variable(guess, requires_grad=True)
    value = func(guess)
    # Newton root-finding update: x <- x - f(x) / f'(x)
    while abs(value.data.numpy()[0]) > threshold:
        value.backward()          # fills guess.grad with f'(guess)
        guess.data -= (value / guess.grad).data
        guess.grad.data.zero_()   # clear the gradient for the next iteration
        value = func(guess)
    return guess.data

guess = torch.ones(1)
x = newton(my_function, guess)
print("x = %s, f(x) = %s" % (x.numpy()[0], my_function(x).numpy()[0]))

Try torch.optim.LBFGS. L-BFGS is a quasi-Newton method: it builds an approximation of the (inverse) Hessian from gradient history, so each step is an approximate Newton update.
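For reference, here is a minimal sketch of using torch.optim.LBFGS. The function being minimized, (x - 2)**2, and the iteration count are arbitrary choices for illustration; note that LBFGS requires a closure that re-evaluates the loss, unlike most other optimizers.

```python
import torch

# Toy objective: minimize f(x) = (x - 2)^2, whose minimum is at x = 2.
x = torch.ones(1, requires_grad=True)
optimizer = torch.optim.LBFGS([x])

def closure():
    # LBFGS calls this closure, possibly several times per step.
    optimizer.zero_grad()
    loss = (x - 2) ** 2
    loss.backward()
    return loss

for _ in range(10):
    optimizer.step(closure)

print(x.item())  # converges toward 2.0
```

Also note that LBFGS minimizes a function, whereas the snippet above finds a root of one; to find a root of f with an optimizer, you would minimize something like f(x)**2 instead.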