.backward() not working in a method of a class

Hello everyone. I really need to embed a .backward() call inside the .forward() of another Function, but I failed. Help!

import numpy as np
import torch
from torch.autograd import Variable, Function

x = np.array([[1., 2.], [3., 4.], [5., 6.]])
x = Variable(torch.from_numpy(x).float(), requires_grad=True)
y = var(x)        # var() is my own Function, see below
y.backward(y)
# it works fine up to here
print(x.grad)

# but -----------------
class Y_fun(Function):
    def forward(self, x):
        y = var(x)
        y.backward(y)
        print(x.grad)      # exact same thing, but in a forward method, failed
        return y

def y_fun(x):
    return Y_fun()(x)

y = y_fun(x)

In the code, var() is also a Function with its own .forward() and .backward().
The error is:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

This piece of code used to work well in PyTorch 0.1.10, but when I tried to continue my work in 0.4, it failed. I spent a whole day trying to figure out the reason, and I am in despair now.

Help, please.

Thank you!

Let’s assume we want to do this in 0.4.

  • Do you really need a Function - if so, what would be your backward?
  • I would probably first switch to new-style (0.2 or so) Functions. There is an example in the torch.autograd.Function documentation. (Note that there is a subtle difference around variables vs. tensors if you look at the 0.2 and 0.3.x docs, but let’s focus on 0.4.)
  • I’m not quite sure that y.backward(y), i.e. differentiating y w.r.t. the inputs while assuming that dloss/dy equals y, is something used very frequently.

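For reference, a minimal new-style Function in 0.4 looks roughly like this (a toy standalone example, not your var; the name MyVar and the hand-derived unbiased-variance gradient are my own illustration):

```python
import torch

class MyVar(torch.autograd.Function):
    # Toy new-style Function: forward/backward are @staticmethods,
    # and state is passed via ctx instead of self.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.var()

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        n = x.numel()
        # d var / d x_i = 2 * (x_i - mean) / (n - 1) for unbiased variance
        return grad_output * 2.0 * (x - x.mean()) / (n - 1)

x = torch.randn(3, 2, requires_grad=True)
y = MyVar.apply(x)
y.backward()
print(x.grad)
```
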
Best regards

Thomas

Thank you for your reply, you guys are crazy, do you ever sleep? :sleeping: I really hope I can fix the code and run it on the latest PyTorch.

The code I posted here is an example; you could change it to y.backward(torch.randn(y.size())) or whatever, it is the same. The point is that I cannot embed one Function into another Function. For example, I want to call B.forward() inside A.forward(), cache the necessary values, and call B.backward() inside A.backward(). So B is a module within A.

The loss function I designed is a little complicated: there are three layers of Function nesting, and I found it fails starting from the second layer.

My code used to work in 0.1.10. It failed in 0.2 a long time ago, and back then I simply downgraded to 0.1. I tried to downgrade to 0.1 again yesterday, but everything has changed since then, torchvision, CUDA… and when I tried to run it, nothing happened.

Can anyone help? Thank you!

Help still needed. Any help will be greatly appreciated!

I thought you wanted to move to PyTorch 0.4?
You’d probably still need a backward for the Function, but the following seems to work reasonably well.

import numpy as np
import torch
x = torch.tensor([[1.,2.], [3.,4.], [5.,6.]], dtype=torch.float, requires_grad=True)
y = torch.var(x)
y.backward(torch.randn(y.size()))     
# it works fine up to here
print(x.grad)       

class Y_fun(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        x_ = x.detach()
        x_.requires_grad=True
        with torch.enable_grad():
            y = torch.var(x_)
            y.backward(torch.randn(y.size()))
        print(x_.grad)      # the same print as in your forward, and now it works
        return y

def y_fun(x):
    return Y_fun.apply(x)

y = y_fun(x)
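
If you do end up needing your own backward, one way to wire the nesting together is to keep the inner graph from forward and replay it in backward (a sketch only; A_fun is a made-up name, and torch.var stands in for your inner Function B):

```python
import torch

class A_fun(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        x_ = x.detach().requires_grad_(True)
        with torch.enable_grad():
            y = torch.var(x_)   # the inner function "B"
        # stash the inner graph for backward
        ctx.x_ = x_
        ctx.y = y
        return y.detach()

    @staticmethod
    def backward(ctx, grad_output):
        # differentiate the inner graph: d y / d x, scaled by grad_output
        grad_x, = torch.autograd.grad(ctx.y, ctx.x_, grad_output)
        return grad_x

x = torch.randn(3, 2, requires_grad=True)
y = A_fun.apply(x)
y.backward()
print(x.grad)
```
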

Yes, it seems to work. Thank you so much!
But could you please show me an example of debugging in .backward()?
I greatly appreciate your help!

If you define a (static) backward function, you can print there.
For checking what’s going on with predefined functions, hooks would be the usual choice:

    x = torch.tensor([[1.,2.], [3.,4.], [5.,6.]], dtype=torch.float, requires_grad=True)
    y = torch.var(x)
    def get_hook(name):
        def my_hook(g):
            print("hook for", name, "grad", g)
        return my_hook
    y.register_hook(get_hook("y"))
    x.register_hook(get_hook("x"))
    y.backward(torch.randn(y.size())) 
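
And for the first option, a sketch of printing from inside a (static) backward, using a made-up identity-style Function that just logs the gradient passing through it:

```python
import torch

class PrintGrad(torch.autograd.Function):
    # Identity function that logs the gradient flowing through it.
    @staticmethod
    def forward(ctx, x):
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        print("grad in backward:", grad_output)  # debug here
        return grad_output

x = torch.tensor([[1., 2.], [3., 4.], [5., 6.]], requires_grad=True)
y = PrintGrad.apply(x).var()
y.backward()
print(x.grad)
```
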

I’m afraid there is only so much advice to offer when you don’t have an example that shows more about what you’re trying to achieve.

Best regards

Thomas


Thank you again, you helped me a lot!