# Use autograd for a custom mathematical function

Hi, I would like to clarify whether it is possible to use autograd to find the derivative of, or do gradient descent on, a custom function. If so, a simple example would be appreciated (e.g. how to find the derivative of / do gradient descent on f(x_1, x_2) = 2 * x_1^3 + 3 * x_2^2, where x_1, x_2 and f(x_1, x_2) are all scalars). Thanks!

@Zeeyuu Cubing and squaring are higher-order polynomial terms, but if you can represent the formula as a series of tensor operations, you can easily use autograd to get the Jacobian.
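As a minimal sketch (using the poster's example f(x_1, x_2) = 2 * x_1^3 + 3 * x_2^2; the learning rate and step count are arbitrary choices for illustration), autograd can compute the gradient and drive a gradient-descent loop directly on plain tensors:

```python
import torch

# f(x1, x2) = 2 * x1^3 + 3 * x2^2, evaluated at (1, 2)
x = torch.tensor([1.0, 2.0], requires_grad=True)
f = 2 * x[0] ** 3 + 3 * x[1] ** 2
f.backward()
print(x.grad)  # analytic gradient (6 * x1^2, 6 * x2) = (6, 12)

# A few steps of gradient descent on the same function.
# Note: f is unbounded below in x1, so this only illustrates the mechanics.
x = torch.tensor([1.0, 2.0], requires_grad=True)
opt = torch.optim.SGD([x], lr=0.01)
for _ in range(5):
    opt.zero_grad()                       # clear gradients from the previous step
    f = 2 * x[0] ** 3 + 3 * x[1] ** 2     # re-evaluate f at the current x
    f.backward()                          # populate x.grad
    opt.step()                            # x <- x - lr * x.grad
```

After the loop, x[1] has moved toward the minimizer of 3 * x_2^2 at 0; x[0] keeps decreasing because the cubic term has no minimum.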

Do I need to define my function as an nn.Module? If so, can you provide a toy example?
Thanks a lot.

An example formula:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
(2 * x ** 3).sum().backward()
# Results: x.grad == 6 * x ** 2, i.e. [6., 24., 54.]
```