Hi all,
Is it possible to neatly encapsulate a torch.fx.GraphModule
that generates its own graph? (Based, say, on __init__
parameters.)
The obvious approach appears not to work:
In [8]: class Mod(fx.GraphModule):
   ...:     def __init__(self):
   ...:         super().__init__(self, fx.Graph())
   ...:         def thingy(x):
   ...:             return x * 2.
   ...:         self.graph = fx.symbolic_trace(thingy)
   ...:         self.recompile()
   ...:
In [9]: m = Mod()
In [10]: m(torch.ones(4))
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-10-56e7b4f5b37c> in <module>
----> 1 m(torch.ones(4))

~/anaconda3/envs/sciml/lib/python3.8/site-packages/torch/fx/graph_module.py in wrapped_call(self, *args, **kwargs)
    306             try:
    307                 sys.excepthook = print_full_traceback
--> 308                 return cls_call(self, *args, **kwargs)
    309             finally:
    310                 sys.excepthook = old_excepthook

~/anaconda3/envs/sciml/lib/python3.8/site-packages/torch/fx/graph_module.py in wrapped_call(self, *args, **kwargs)
    306             try:
    307                 sys.excepthook = print_full_traceback
--> 308                 return cls_call(self, *args, **kwargs)
    309             finally:
    310                 sys.excepthook = old_excepthook

~/anaconda3/envs/sciml/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
    887                 result = self._slow_forward(*input, **kwargs)
    888             else:
--> 889                 result = self.forward(*input, **kwargs)
    890             for hook in itertools.chain(
    891                     _global_forward_hooks.values(),

TypeError: forward() takes 1 positional argument but 2 were given
Adding an explicit self breaks recompile():
In [11]: class Mod(fx.GraphModule):
    ...:     def __init__(self):
    ...:         super().__init__(self, fx.Graph())
    ...:         def thingy(self, x):
    ...:             return x * 2.
    ...:         self.graph = fx.symbolic_trace(thingy)
    ...:         self.recompile()
    ...:
In [12]: m = Mod()
Traceback (most recent call last):
  ...
    self.recompile()
  File "<eval_with_key_9>", line 2
    def forward(self, self, x):
                      ^
SyntaxError: duplicate argument 'self' in function definition
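For what it's worth, one variant does seem to work for me (a sketch only; the `scale` parameter and `thingy` are just my example): keep the traced function self-free, trace it first, and then pass the traced module as the root together with its `.graph` attribute (rather than the `GraphModule` that `symbolic_trace` returns) to the parent constructor:

```python
import torch
import torch.fx as fx

class Mod(fx.GraphModule):
    def __init__(self, scale: float):
        # Free function (no self), so the generated forward's signature
        # won't collide with GraphModule's own self parameter.
        def thingy(x):
            # scale is a plain Python float, so it is baked into the
            # graph as a constant at trace time
            return x * scale
        traced = fx.symbolic_trace(thingy)
        # Pass the traced module as root, and its .graph attribute
        # (not the GraphModule itself) as the graph.
        super().__init__(traced, traced.graph)

m = Mod(3.0)
print(m(torch.ones(4)))  # tensor([3., 3., 3., 3.])
```

But this still feels like a workaround rather than a supported pattern.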
Is this sort of architecture — module that generates its own compute graph — supported?
Thanks!