Turn a sympy expression into a trainable equation

Hello! I have some sympy expressions of the form c0*cos(c1*y) + c2 + c3*x**2. I want to turn the parameters c0, c1, c2, c3 into trainable PyTorch parameters and run gradient descent on them (as I would with an actual NN) to fit them to some data of the form (x, y, z). Is there a way to do this? Thank you!


You can define each cX as an nn.Parameter, which makes them trainable.
Also, you can wrap them in an nn.Module, which makes handling them easier:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Each coefficient is a trainable parameter with a random init.
        self.c0 = nn.Parameter(torch.randn(1, 1))
        self.c1 = nn.Parameter(torch.randn(1, 1))
        self.c2 = nn.Parameter(torch.randn(1, 1))
        self.c3 = nn.Parameter(torch.randn(1, 1))

    def forward(self, x, y):
        out = self.c0 * torch.cos(self.c1 * y) + self.c2 + self.c3 * x**2
        return out

model = MyModel()
x, y = torch.randn(1, 1), torch.randn(1, 1)
out = model(x, y)
out.backward()  # out has a single element, so no grad argument is needed
for name, param in model.named_parameters():
    print(name, param.grad)
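To actually fit (x, y, z) data, the parameters can then be optimized with a standard training loop. A minimal sketch (repeating the model above; the toy data coefficients, Adam, and the MSE loss are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):  # same model as above
    def __init__(self):
        super().__init__()
        self.c0 = nn.Parameter(torch.randn(1))
        self.c1 = nn.Parameter(torch.randn(1))
        self.c2 = nn.Parameter(torch.randn(1))
        self.c3 = nn.Parameter(torch.randn(1))

    def forward(self, x, y):
        return self.c0 * torch.cos(self.c1 * y) + self.c2 + self.c3 * x**2

torch.manual_seed(0)
model = MyModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# Toy (x, y, z) data generated from known coefficients; substitute your own samples.
x = torch.rand(200)
y = torch.rand(200)
z = 2.0 * torch.cos(0.5 * y) + 1.0 + 3.0 * x**2

initial_loss = loss_fn(model(x, y), z).item()
for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x, y), z)
    loss.backward()
    optimizer.step()
final_loss = loss.item()
```

After training, model.c0 through model.c3 hold the fitted coefficients.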

Thank you for your reply! That is the final form I want to get to. However, my question was whether there is a way to go automatically from the sympy expression to this form (sorry if that was not clear). I have lots of sympy expressions, and I want to pass them to a PyTorch module for training automatically, not write everything by hand. For example, for the line out = self.c0 * torch.cos(self.c1 * y) + self.c2 + self.c3 * x**2, I would like some function that turns my sympy expression into that automatically, something like out = sympy_to_pytorch(sympy_expression). And something similar for picking the parameters out of my sympy expression and doing the self.c0 = nn.Parameter(torch.randn(1, 1)) part on its own.

Oh, in that case I misunderstood the use case. :slight_smile:

That’s an interesting idea. I’m unfortunately not deeply familiar with sympy, but do you know if the “parameters” have some kind of flag?
I assume we could use a parser to check for the “trainable” flag and create nn.Parameters out of them.

EDIT: I’m not sure if creating the actual function would be easy. It seems we would need some mapping between sympy and PyTorch methods.
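One way to build that mapping is sympy.lambdify, which accepts a dict of replacement functions. Below is a minimal sketch, not an existing API: SymPyModule, TORCH_FUNCS, and the convention that every non-input free symbol becomes a parameter are my own assumptions. The expression is lambdified against torch functions, so autograd can differentiate through the generated code.

```python
import sympy as sp
import torch
import torch.nn as nn

# Assumed mapping from SymPy function names to torch equivalents; extend as needed.
TORCH_FUNCS = {"cos": torch.cos, "sin": torch.sin, "exp": torch.exp, "tanh": torch.tanh}

class SymPyModule(nn.Module):
    """Sketch: wrap a SymPy expression as a trainable nn.Module.

    Every free symbol not listed in `input_syms` becomes an nn.Parameter.
    """
    def __init__(self, expr, input_syms):
        super().__init__()
        self.input_syms = list(input_syms)
        # Sort for a deterministic parameter/argument order.
        self.param_syms = sorted(expr.free_symbols - set(self.input_syms),
                                 key=lambda s: s.name)
        self.params = nn.ParameterDict(
            {s.name: nn.Parameter(torch.randn(())) for s in self.param_syms})
        # lambdify against torch functions so the result is autograd-friendly;
        # plain arithmetic (+, *, **) works on tensors via the numpy fallback printer.
        self.func = sp.lambdify(self.input_syms + self.param_syms, expr,
                                modules=[TORCH_FUNCS, "numpy"])

    def forward(self, *inputs):
        params = [self.params[s.name] for s in self.param_syms]
        return self.func(*inputs, *params)

# Usage with the expression from the question:
x, y = sp.symbols("x y")
c0, c1, c2, c3 = sp.symbols("c0 c1 c2 c3")
expr = c0 * sp.cos(c1 * y) + c2 + c3 * x**2
model = SymPyModule(expr, [x, y])
out = model(torch.randn(5), torch.randn(5))
```

Calling out.sum().backward() then populates .grad for c0 through c3, just like in the handwritten model.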


I’ve been working on exactly that, by adding functionality to sympy that allows reasoning about PyTorch tensors.

I’ve documented a few examples here:


That looks pretty cool! Thanks for sharing :slight_smile:


There’s also sympytorch, which looks relatively complete: GitHub - patrick-kidger/sympytorch: Turning SymPy expressions into PyTorch modules.

There’s also cutcutcodec, which is able to compile sympy expressions to PyTorch functions with some optimizations, although not many functions are implemented yet: cutcutcodec/core/compilation/sympy_to_torch.py · main · robin richard / cutcutcodec · GitLab

Now, this module is very complete: cutcutcodec/core/compilation/sympy_to_torch · main · robin richard / cutcutcodec · GitLab

  • It supports a lot of functions.
  • It factors out common sub-patterns to completely eliminate redundancy.
  • It offers dynamic or compiled evaluation, for safety or speed.
  • It sorts the function arguments for better broadcasting efficiency.
  • It manages the cache smartly for variant and invariant input tensors.
  • It reuses internal variables to limit RAM usage and speed things up by reducing allocations.