Can you access ctx outside a torch.autograd.Function?

Looks like that worked! Thanks!! Unfortunately it only solves the main problem of making grad_out what I want for the actual grad_out argument of backward(). A big part of the math used to rely on member variables that need to be modified both inside and outside the call to backward(). I started a new thread since it seems like a slightly different topic now: How to transition to functions not being allowed to have member variables
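In case it helps anyone landing here: one common workaround for the "member variables" problem is to pass a mutable container (e.g. a plain dict) into forward() and attach it to ctx. Since ctx holds a reference to the same object the caller owns, state can be read and modified both inside backward() and from outside. This is just a minimal sketch under that assumption; the names (ScaleFn, state) are made up for illustration:

```python
import torch

class ScaleFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, state):
        # Non-tensor objects can be attached to ctx as attributes;
        # tensors needed for the gradient should use ctx.save_for_backward.
        ctx.state = state
        return x * state["scale"]

    @staticmethod
    def backward(ctx, grad_out):
        # Read and mutate the shared state inside backward().
        grad_x = grad_out * ctx.state["scale"]
        ctx.state["calls"] += 1
        # backward() returns one gradient per forward() input;
        # non-tensor inputs like the dict get None.
        return grad_x, None

state = {"scale": 2.0, "calls": 0}
x = torch.randn(3, requires_grad=True)
y = ScaleFn.apply(x, state)
y.sum().backward()

# The same dict is visible outside, so changes made in backward() persist,
# and the caller can also mutate it between iterations.
print(state["calls"])  # 1
state["scale"] = 3.0   # modified outside backward()
```

Whether this is appropriate depends on the use case (it won't play nicely with double backward or graph reuse), but it keeps per-call state off the Function class itself, which is the restriction the new-style static forward/backward API enforces.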