Hi, I would like to clarify: is it possible to use autograd to find the derivative of, or to do gradient descent on, a custom function? If yes, a simple example would be appreciated (e.g., how to find the derivative / do gradient descent for f(x_1, x_2) = 2 * x_1^3 + 3 * x_2^2, where x_1, x_2, and f(x_1, x_2) are all scalars). Thanks!

@Zeeyuu Cubing and squaring are higher-order polynomial terms, but as long as you can express the formula as a series of tensor operations, you can easily use autograd to get the gradient/Jacobian.
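For the exact function asked about, a minimal sketch (no `nn.Module` needed; just build the expression from tensors that require gradients and call `backward()`):

```python
import torch

# f(x1, x2) = 2 * x1^3 + 3 * x2^2, with scalar inputs
x1 = torch.tensor(1.0, requires_grad=True)
x2 = torch.tensor(2.0, requires_grad=True)

f = 2 * x1**3 + 3 * x2**2
f.backward()  # populates x1.grad and x2.grad

# Analytic check: df/dx1 = 6*x1^2 = 6, df/dx2 = 6*x2 = 12
print(x1.grad)  # tensor(6.)
print(x2.grad)  # tensor(12.)
```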

Do I need to define my function as an nn.Module? If so, can you provide a toy example?

Thanks a lot.

An example formula:

```
import torch

x = torch.tensor([1., 2., 3.])
# nn.Parameter requires grad by default
W = torch.nn.Parameter(torch.tensor([[0.4], [0.5], [-0.6]]))
c = torch.matmul(x, W)         # linear combination, shape (1,)
d = torch.sigmoid(c)           # torch.nn.functional.sigmoid is deprecated
loss = torch.tensor([1.]) - d  # single-element tensor, so backward() is fine
loss.backward()
W.grad
```

```
# Results
tensor([[-0.2403],
        [-0.4805],
        [-0.7208]])
```
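For the gradient-descent half of the question, a minimal sketch using `torch.optim.SGD` on the same f(x_1, x_2) = 2 * x_1^3 + 3 * x_2^2 (the learning rate 0.05 and starting point (1, 2) are arbitrary choices for illustration):

```python
import torch

# Pack both scalars into one tensor: x[0] = x_1, x[1] = x_2
x = torch.tensor([1.0, 2.0], requires_grad=True)
opt = torch.optim.SGD([x], lr=0.05)

for step in range(100):
    opt.zero_grad()                # clear gradients from the previous step
    f = 2 * x[0]**3 + 3 * x[1]**2
    f.backward()                   # compute df/dx
    opt.step()                     # x <- x - lr * grad

print(x)
```

Note that the 2 * x_1^3 term is unbounded below, so f has no global minimum; from this starting point the iterates merely approach the stationary point (0, 0).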
