I have a question: if I want to save the gradient of an intermediate tensor produced during backward for later use, what should I do? Below is a small demo. Thanks!

```
import torch
import gc

a = []

class Exp(torch.autograd.Function):
    @staticmethod
    def forward(ctx, i):
        result = torch.exp(i)
        ctx.save_for_backward(result)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        result, = ctx.saved_tensors
        # stash the incoming gradient for later inspection
        a.append(grad_output.detach())
        return grad_output * result

x = torch.tensor([3., 4.], requires_grad=True)
y = Exp.apply(x)  # apply is called on the Function class, not an instance
y.sum().backward()
gc.collect()
torch.cuda.empty_cache()
print(a)
```
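For comparison, the same effect can be achieved without writing a custom `Function` at all, by attaching a hook to the intermediate tensor with `Tensor.register_hook` (a sketch using standard PyTorch APIs; the `saved` list name is just for illustration):

```
import torch

saved = []

x = torch.tensor([3., 4.], requires_grad=True)
y = torch.exp(x)
# The hook fires during backward with the gradient flowing into y;
# detach().clone() keeps a copy outside the autograd graph.
y.register_hook(lambda grad: saved.append(grad.detach().clone()))
y.sum().backward()
print(saved)
```

Here the gradient of `sum()` with respect to `y` is a tensor of ones, so `saved` ends up holding `[tensor([1., 1.])]`.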