I’ve encountered some problems with PyTorch and wrote a minimal example that doesn’t work.
If I run this code in a Jupyter notebook, the kernel dies:
```python
import torch
from torch.autograd import Variable, Function

# I've removed all computations from this class:
class Some_class(Function):
    @staticmethod
    def forward(ctx, A, X):
        ctx.A = A
        return torch.ones(A.shape, X.shape).double()

    @staticmethod
    def backward(ctx, g):
        return Variable(torch.ones_like(ctx.A).double(), requires_grad=False), None

f = Some_class.apply

n, l = 7, 1
A = Variable(torch.rand(n, 3).double(), requires_grad=True)
U = Variable(torch.rand(n, l).double())
b = Variable(torch.rand(n, l).double())

Z = b - f(A, U)
Y = f(A, Z)
res = torch.norm(Y)
res.backward()
```
But if I run the same script from the terminal, everything is OK.
Could you tell me whether this is a known bug with Jupyter?
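As an aside, I also tried rewriting the stub with `ctx.save_for_backward`, which I believe is the documented way to stash tensors needed in `backward` (storing them as plain `ctx` attributes may keep extra references into the graph). This is just a sketch of that variant, with the `forward` output shape simplified to something well-defined (`A.shape[0] × X.shape[0]`), so it is not an exact reproduction of the failing code:

```python
import torch
from torch.autograd import Function

class SomeClassSaved(Function):
    @staticmethod
    def forward(ctx, A, X):
        # Documented pattern: register A for use in backward.
        ctx.save_for_backward(A)
        # Illustrative output shape (the original torch.ones(A.shape, X.shape)
        # call is ambiguous), chosen here as (rows of A) x (rows of X).
        return torch.ones(A.shape[0], X.shape[0]).double()

    @staticmethod
    def backward(ctx, g):
        A, = ctx.saved_tensors
        # Gradient w.r.t. A only; X receives no gradient.
        return torch.ones_like(A).double(), None

f = SomeClassSaved.apply

n, l = 7, 1
A = torch.rand(n, 3).double().requires_grad_()
U = torch.rand(n, l).double()
b = torch.rand(n, l).double()

Z = b - f(A, U)   # (n, n) after broadcasting against b
Y = f(A, Z)       # (n, n)
res = torch.norm(Y)
res.backward()
print(A.grad.shape)  # expect torch.Size([7, 3])
```

I don’t know yet whether this version avoids the kernel crash, since the problem only shows up in the notebook environment.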