My problem is best illustrated with an example.

I wrapped an `autograd.Function` subclass inside the `uniform` function and used it in the `forward` method of the `target_fn` class, which works fine, as below:

```
def uniform(k):
    class qfn(torch.autograd.Function):
        @staticmethod
        def forward(ctx, input):
            ...
            return out

        @staticmethod
        def backward(ctx, grad_output):
            ...
            return grad_input

    return qfn.apply


class target_fn(nn.Module):
    def __init__(self, bit):
        super(target_fn, self).__init__()
        ...
        self.uni = uniform(k=bit)

    def forward(self, x):
        # self.max = max(x)
        output = self.uni(x)
        return output
```

However, I want the `self.max` computed in `target_fn.forward` to be used by `uniform` for some preprocessing. I tried something like the code below, but it raised an error:

```
def uniform(k, max_value):
    class qfn(torch.autograd.Function):
        @staticmethod
        def forward(ctx, input):
            ...
            out = out / max_value
            return out

        @staticmethod
        def backward(ctx, grad_output):
            ...
            return grad_input

    return qfn.apply


class target_fn(nn.Module):
    def __init__(self, bit):
        super(target_fn, self).__init__()
        ...
        self.uni = uniform(k=bit, max_value=self.max)  # error: the target_fn instance does not have self.max yet

    def forward(self, x):
        self.max = max(x)
        output = self.uni(x)
        return output
```

The funny thing is that I can actually access `target_fn.max` in the debugger (after `forward` has run), but the line `self.uni = uniform(k=bit, max_value=self.max)` in `__init__` cannot.
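In case it helps, here is a minimal torch-free repro of what I think is the same error (`Demo`, `make_fn`, and the attribute name are made-up stand-ins for my actual code): an attribute that is only assigned inside `forward` does not exist yet while `__init__` is still running.

```python
def make_fn(max_value):
    # Stand-in for uniform(): captures max_value in a closure.
    return lambda x: x / max_value


class Demo:
    def __init__(self):
        # forward() has never run at this point, so self.max is unset.
        self.fn = make_fn(self.max)  # raises AttributeError

    def forward(self, x):
        self.max = max(x)
        return self.fn(x)


try:
    Demo()
except AttributeError as e:
    print(type(e).__name__, e)
```

By the time the debugger inspects the instance, `forward` has already assigned `self.max`, which would explain why it is visible there but not during construction.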

How could I fix it? Thank you.