Custom autograd.function with heavy calculations required only once

I want to write a custom autograd function. I noticed that there are two ways of doing it: the old class-style API and the new static-method API.

I prefer the old style because I can do a heavy calculation that is required only once in __init__, like this:

class MyFunc(torch.autograd.Function):
    def __init__(self, arg1):
        super().__init__()
        # the heavy calculation runs once, when the Function object is constructed
        self.arg1_dash = heavy_calculation(arg1)

    def forward(self, x1):
        return x1 + self.arg1_dash

    def backward(self, grad_out):
        return ...  # gradient w.r.t. x1

Is there a way to mimic the old style with the new style?

You could either stash values on the context (tensors via ctx.save_for_backward, other objects as plain ctx attributes — note these persist only from forward to backward of a single call), or cache the one-time result in a module-level global, as below.

    var_buffer = None  # module-level cache

    def api():
        global var_buffer
        if var_buffer is None:
            var_buffer = 5  # heavy calculation executed only the first time
        else:
            print('cached value found', var_buffer)
        return var_buffer
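Another way to mimic the old style is to run the heavy calculation once outside the Function and pass its result into the static forward as an extra argument. A minimal sketch, where heavy_calculation is a stand-in for your expensive preprocessing:

```python
import torch

def heavy_calculation(arg1):
    # stand-in for the expensive one-time preprocessing
    return arg1 * 2

class MyFuncNew(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x1, arg1_dash):
        # arg1_dash is the precomputed result; nothing heavy happens here
        return x1 + arg1_dash

    @staticmethod
    def backward(ctx, grad_out):
        # d(x1 + c)/dx1 = 1; no gradient for the precomputed constant
        return grad_out, None

# run the heavy step exactly once, then reuse the result on every call
arg1_dash = heavy_calculation(torch.tensor([1.0, 2.0]))
x = torch.tensor([0.5, 0.5], requires_grad=True)
y = MyFuncNew.apply(x, arg1_dash)
```

Since backward returns None for the second input, autograd treats arg1_dash as a constant, which matches what self.arg1_dash did in the old style.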