I am trying to compute the Hessian matrix of the weights in a convolutional kernel. However, PyTorch has no API that can do this the way TensorFlow does.
TensorFlow will give you the diagonal of the Hessian, not the full Hessian (if I am not mistaken). In the current version of PyTorch there is no way to do this, but the feature is planned for version 0.2, the next major release.
AFAIK TensorFlow returns a Hessian-vector product, like most automatic differentiation software.
Any updates regarding second order derivatives in PyTorch?
The autograd branch, which will be merged soon, supports repeated application of .backward (or, more conveniently, a new autograd.differentiate operator) and can compute the exact Hessian-vector product.
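For anyone finding this later: in released PyTorch versions (0.2 and onward), the same idea is exposed through `torch.autograd.grad` with `create_graph=True`, which lets you differentiate through a gradient. A minimal sketch of an exact Hessian-vector product (the example function `f(w) = (w ** 3).sum()` is my own; its Hessian is `diag(6 * w)`, so the result can be checked by hand):

```python
import torch

# f(w) = sum(w^3)  =>  grad = 3 * w^2,  Hessian = diag(6 * w)
w = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
v = torch.tensor([1.0, 1.0, 1.0])  # vector to multiply the Hessian by

loss = (w ** 3).sum()

# First backward pass: create_graph=True keeps the graph of the
# gradient computation so it can itself be differentiated.
grad, = torch.autograd.grad(loss, w, create_graph=True)

# Second backward pass: d(grad . v)/dw == H @ v, without ever
# materializing the full Hessian.
hvp, = torch.autograd.grad(grad @ v, w)
print(hvp)  # tensor([ 6., 12., 18.])  == 6 * w * v
```

To recover the full Hessian of a small kernel, you can call the second `grad` once per basis vector `v` and stack the results, though that costs one backward pass per parameter.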
Is the autograd branch compatible with master right now? Does autograd.differentiate support taking the gradient of a higher-order function of the gradient? I dug around but couldn't find a roadmap for the next release.