Testing if mathutils works with torch

I had a question. In general, what's the best way to test whether autograd works with certain external packages? That is, that the gradient is being properly calculated…

In my case I am coding a GAN: the generator outputs some values, but before I pass them into the discriminator I have to perform some transformations (this is odometry data). Those transformations involve this package, mathutils. Here's an example of a function I apply to the output of the generator (note that R and T are the rotation and translation values that determine a 6-DoF pose):


import torch
import mathutils

def to_rotation_matrix(R, T):
    if isinstance(R, torch.Tensor):
        # Torch branch: quat2mat / tvector2mat (defined elsewhere) build
        # 4x4 homogeneous matrices from the quaternion and the translation
        R = quat2mat(R)
        T = tvector2mat(T)
        RT = torch.mm(T, R)
    else:
        # mathutils branch: R is a mathutils.Quaternion, T a 3-vector
        R = R.to_matrix()
        R.resize_4x4()
        T = mathutils.Matrix.Translation(T)
        RT = T @ R
    return RT

I want to ensure that using these sorts of custom functions isn't going to interfere with autograd and that all the gradients will be computed correctly. I'm trying to find the best way to test whether autograd is correctly calculating the gradients, i.e., that it's accounting for the transformations and rotations that I am doing. Is there a good/simple way to test this? Thanks

I had a question. In general, what's the best way to test whether autograd works with certain external packages? That is, that the gradient is being properly calculated…

In general, if a package does not work with torch.Tensor, it cannot be supported by autograd. So mathutils won't work in this case, as I don't think you can pass torch.Tensors to it as input.
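A quick sanity check is to run a tensor with requires_grad=True through your transformation and inspect the result: if the graph is intact, the output has a grad_fn and backward() populates the input's .grad, whereas a function that drops to mathutils/numpy/plain Python values will either raise an error or return something with no grad_fn. For the pure-torch branch you can also use torch.autograd.gradcheck, which compares analytic gradients against finite differences (use double precision). A minimal sketch, with a stand-in transform in place of your to_rotation_matrix:

import torch
from torch.autograd import gradcheck

# Stand-in for the torch branch of your transformation (replace with your own)
def transform(x):
    return (x * 2.0).sum()

x = torch.randn(4, dtype=torch.double, requires_grad=True)
y = transform(x)
print(y.grad_fn)   # should not be None if the graph is intact
y.backward()
print(x.grad)      # should not be None either

# gradcheck verifies the analytic gradient against finite differences
print(gradcheck(transform, (torch.randn(4, dtype=torch.double, requires_grad=True),)))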

Note that if you really need such a function, you can still use it by wrapping it in a custom autograd Function (docs here: Extending PyTorch — PyTorch 2.1 documentation) that teaches autograd what its backward pass is. That way you will be able to use it within PyTorch's autograd.
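For illustration, here is a minimal sketch of that pattern. It wraps a toy mathutils operation (a dot product), not your pose transformation: forward leaves the autograd graph by converting to plain Python values, and backward has to return the analytic gradient by hand. For your case you would put the mathutils rotation/translation construction in forward and its derivative in backward.

import torch
import mathutils

class MathutilsDot(torch.autograd.Function):
    # Toy example: forward uses mathutils (no autograd), backward supplies
    # the gradient manually so the op can sit inside an autograd graph.

    @staticmethod
    def forward(ctx, a, b):
        ctx.save_for_backward(a, b)
        # mathutils works on plain floats, so the graph is cut here
        out = mathutils.Vector(a.detach().tolist()).dot(mathutils.Vector(b.detach().tolist()))
        return a.new_tensor(out)

    @staticmethod
    def backward(ctx, grad_output):
        a, b = ctx.saved_tensors
        # d(a·b)/da = b and d(a·b)/db = a
        return grad_output * b, grad_output * a

a = torch.randn(3, requires_grad=True)
b = torch.randn(3, requires_grad=True)
MathutilsDot.apply(a, b).backward()
print(a.grad, b.grad)   # gradients flow even though forward used mathutils

You can then run gradcheck on MathutilsDot.apply with double-precision inputs to confirm that the hand-written backward matches finite differences.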