Gradcheck reports an all-zero analytical Jacobian after upgrading PyTorch to 1.1.0

In PyTorch 1.1.0, Python 3.7.3:
I defined a custom layer using autograd.Function and would like to check whether I implemented the backward pass correctly.
I created input1 and input2 as Variables and set requires_grad = True on both of them. Then I called:
autograd.gradcheck(myLayer, (input1, input2))
This raises the error "Jacobian mismatch for output 1 with respect to input 0", and it shows that the analytical Jacobian is a zero matrix (all entries are 0).
However, if I switch back to PyTorch 0.4.0, the analytical Jacobian matches what I defined in backward.

I'm very confused about what the difference between the two versions could be.
Any pointers would help!
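For reference, here is a minimal, self-contained sketch of the kind of check I'm running. The toy elementwise-product Function below is just a stand-in for myLayer (whose body I've omitted), and the double-precision inputs follow the usual gradcheck recommendation:

```python
import torch
from torch import autograd

class MyMul(autograd.Function):
    # Toy stand-in for myLayer: elementwise product of two inputs.
    @staticmethod
    def forward(ctx, x, y):
        ctx.save_for_backward(x, y)
        return x * y

    @staticmethod
    def backward(ctx, grad_out):
        x, y = ctx.saved_tensors
        # d(x*y)/dx = y, d(x*y)/dy = x
        return grad_out * y, grad_out * x

# gradcheck compares analytical and numerical Jacobians, so
# double precision is recommended to keep finite differences accurate
input1 = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
input2 = torch.randn(4, 3, dtype=torch.double, requires_grad=True)

# Note: in PyTorch >= 1.0, a new-style (staticmethod) Function is
# invoked through its .apply attribute, which is what goes through
# the autograd machinery.
ok = autograd.gradcheck(MyMul.apply, (input1, input2))
print(ok)
```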


We cannot really tell you much unless you share a bit more about myLayer. Could you narrow down which function might be causing this?

Best regards