.backward() not working in a method of a class

Thank you for your reply, you guys are amazing, do you ever sleep? :sleeping: I really hope I can fix the code and run it in the latest PyTorch.

The code I posted here is just an example; you could change it to y.backward(torch.randn(y.size())) or whatever, the result is the same. The point is that I cannot embed one Function inside another Function. For example, I want to call function B.forward() inside A.forward(), cache the necessary values, and call B.backward() inside A.backward(), so that B acts as a module inside A.
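In the modern `torch.autograd.Function` API (static `forward`/`backward` methods), one way to get this nesting is to factor the inner function's math into plain helpers and call them from the outer Function's `forward` and `backward`. Below is a minimal sketch under assumed, illustrative definitions: an "inner" B computing x**2 and an "outer" A computing sin(B(x)); the names `b_forward`/`b_backward` are hypothetical, not a PyTorch API.

```python
import torch

# Hypothetical inner function B: forward computes x**2,
# backward applies its local derivative 2*x via the chain rule.
def b_forward(x):
    return x ** 2

def b_backward(grad_out, x):
    return grad_out * 2 * x

class A(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        y = b_forward(x)              # reuse B's forward inside A's forward
        ctx.save_for_backward(x, y)   # cache what both A and B need
        return torch.sin(y)

    @staticmethod
    def backward(ctx, grad_out):
        x, y = ctx.saved_tensors
        grad_y = grad_out * torch.cos(y)  # A's local derivative w.r.t. y
        return b_backward(grad_y, x)      # chain through B's backward

x = torch.randn(3, requires_grad=True)
out = A.apply(x)
out.backward(torch.ones_like(out))
# x.grad should equal cos(x**2) * 2*x
```

The same pattern stacks for deeper nesting: each outer backward multiplies in its own local gradient and then hands the result to the next inner backward helper.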

The loss function I designed is a bit complicated: there are three levels of nested Functions, and I found it fails starting from the second level.

My code used to work in 0.1.10. That was a long time ago, and it broke in 0.2, so at the time I downgraded back to 0.1. I tried downgrading to 0.1 again yesterday, but I found everything has changed since then (torchvision, CUDA, ...), and when I tried to run it, nothing happened.