Hi! I wrote the function below, which I intend to use as a loss function.

```
import numpy as np
import torch
import torch.nn.functional as F
from torch.autograd import Variable

class MyCriterion(torch.autograd.Function):
    def __init__(self):
        self.alpha = .0005

    def forward(self, input, target, epoch, isLabeled):
        loss = F.cross_entropy(input, target)
        self.save_for_backward(input, target, epoch, isLabeled, loss)
        print(self.saved_tensors)  # prints ()
        if (isLabeled.data > 0).all():
            return Variable(loss.data * self.alpha * epoch.data)
        return loss

    def backward(self, grad_output):
        input, target, epoch, isLabeled, loss = self.saved_tensors
        grad_input = loss.backward()
        return grad_input

my_criterion = MyCriterion()
x = Variable(torch.randn(11, 10).type(torch.FloatTensor))
y = Variable(torch.range(1, 6, .5).type(torch.LongTensor))
a = torch.from_numpy(np.array([0]))
b = torch.from_numpy(np.array([1]))
c = torch.from_numpy(np.array([10.0]))
print(x)

first_loss = my_criterion.forward(x, y, Variable(c.float()), Variable(a))
print(my_criterion.backward(first_loss))
second_loss = my_criterion.forward(x, y, Variable(c.float()), Variable(b))
print(my_criterion.backward(second_loss))
```

When I run this, I get the error below:

```
---> 18 input, target, epoch, isLabeled, loss, = self.saved_tensors
19 grad_input = loss.backward()
20 return grad_input
ValueError: not enough values to unpack (expected 5, got 0)
```

Is there something I am missing? How can I access the saved tensors? Is there any documentation beyond the autograd notes with more examples of autograd functions? Thanks a lot!
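For reference, here is the pattern I was trying to follow, written as a minimal sketch with the static-method `Function` API and illustrative names (`ScaledLoss` and `scale` are my own placeholders, not from my code above). In this sketch, `saved_tensors` does get populated, apparently because the call goes through `.apply` rather than calling `forward` directly:

```
import torch

class ScaledLoss(torch.autograd.Function):
    # Minimal sketch: scales the input, sums it, and scales the gradient back.
    @staticmethod
    def forward(ctx, input, scale):
        ctx.save_for_backward(input)  # only tensors belong here
        ctx.scale = scale             # plain Python values go on ctx directly
        return (input * scale).sum()

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors  # populated when invoked via .apply
        # d(sum(input * scale))/d(input) = scale, for every element
        return grad_output * ctx.scale * torch.ones_like(input), None

x = torch.randn(3, requires_grad=True)
loss = ScaledLoss.apply(x, 0.5)  # call via .apply, not .forward
loss.backward()
print(x.grad)  # each element's gradient is 0.5
```

I mention it only to show the contrast: in my own class above, `save_for_backward` is called, yet `saved_tensors` comes back empty.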