yatra9
(Yatra9)
September 8, 2018, 6:14pm
I want to write a custom function.
I noticed that there are two ways of doing it.
I created a simple autograd function, let’s call it F (based on torch.autograd.Function).
What’s the difference between calling
a = F.apply(args)
and instantiating it and then calling it, like this:
f = F()
a = f(args)
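For reference, here is a minimal sketch of the new style, where forward and backward are staticmethods, per-call state goes through ctx, and the function is invoked via .apply. The Scale name and the doubling factor are just illustrative:

```python
import torch

class Scale(torch.autograd.Function):
    # New style: no __init__, no instance; state is passed through ctx.
    @staticmethod
    def forward(ctx, x, factor):
        ctx.factor = factor  # non-tensor state can be stashed on ctx
        return x * factor

    @staticmethod
    def backward(ctx, grad_out):
        # One gradient per forward input; non-tensor inputs get None.
        return grad_out * ctx.factor, None

x = torch.ones(3, requires_grad=True)
y = Scale.apply(x, 2.0).sum()
y.backward()
print(x.grad)  # tensor([2., 2., 2.])
```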
Both versions seem to be used in the PyTorch code base and in examples.
I prefer the old style to the new style because I can do heavy calculations, which are required only once, in __init__, like this:
class MyFunc(torch.autograd.Function):
    def __init__(self, arg1):
        super().__init__()
        self.arg1_dash = heavy_calculation(arg1)

    def forward(self, x1):
        return x1 + self.arg1_dash

    def backward(self, grad_out):
        return ...
Is there a way to mimic the old style with a new style?
You could either use ctx.save_for_backward(...) to stash tensors needed in the backward pass, or play around with global variables, as below.
def api(self):
    global var_buffer
    if 'var_buffer' not in globals():
        var_buffer = 5  # executed only the first time
    else:
        print('setter found', var_buffer)
    return 1
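Another common pattern for mimicking the old style is to keep the heavy, one-time computation in an nn.Module's __init__ and pass its result into the static forward as an extra argument. A minimal sketch, with a trivial stand-in for heavy_calculation (the real computation is whatever your use case needs):

```python
import torch

class MyFunc(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x1, arg1_dash):
        # arg1_dash arrives precomputed; forward stays cheap.
        return x1 + arg1_dash

    @staticmethod
    def backward(ctx, grad_out):
        # d(x1 + c)/dx1 = 1; None for the precomputed argument.
        return grad_out, None

class MyModule(torch.nn.Module):
    def __init__(self, arg1):
        super().__init__()
        # Stand-in for heavy_calculation(arg1): runs once, cached here.
        self.arg1_dash = arg1 * 10

    def forward(self, x1):
        return MyFunc.apply(x1, self.arg1_dash)

m = MyModule(torch.tensor(1.0))
x = torch.zeros(2, requires_grad=True)
out = m(x).sum()
out.backward()
```

This keeps the one-time cost out of the hot path while staying compatible with the staticmethod API.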