NotImplementedError('wrap must be called at the top level of a module')


First, I define an operator in add.cpp:

#include <torch/extension.h>

torch::Tensor add_ops(torch::Tensor x, torch::Tensor y) {
    torch::Tensor z = x + y;
    return z;
}

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
    m.def("add_ops", &add_ops, "add ops");
}

Then I run python setup.py build_ext --inplace to compile add.cpp into add.so and use it.
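For reference, my build script looks roughly like this (a sketch; the module name add_ops and the file paths are what I believe I used):

```python
# setup.py -- minimal build script for the C++ extension.
# The extension name "add_ops" must match the module name imported in Python.
from setuptools import setup
from torch.utils.cpp_extension import CppExtension, BuildExtension

setup(
    name="add_ops",
    ext_modules=[CppExtension(name="add_ops", sources=["add.cpp"])],
    cmdclass={"build_ext": BuildExtension},
)
```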

pyadd.py

import torch
import add_ops
from torch.fx import wrap

add = add_ops.add_ops
wrap("add")

def pyadd(n1, n2):
    return add(n1, n2)

I again run python setup.py build_ext --inplace to compile pyadd.py into pyadd.so.
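Compiling a .py file with build_ext goes through Cython, so the build script for this step is roughly (a sketch; I may have extra options in mine):

```python
# setup.py -- compiles pyadd.py itself into a shared library via Cython,
# so pyadd is later imported from pyadd.so instead of pyadd.py.
from setuptools import setup
from Cython.Build import cythonize

setup(ext_modules=cythonize("pyadd.py"))
```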

But when I use pyadd.so, for example:

import torch
from pyadd import pyadd

pyadd(torch.randn(10), torch.randn(10))

Then I get the error: wrap must be called at the top level of a module.
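From what I can tell, torch.fx.wrap inspects its caller's stack frame and only accepts calls made at module import time. Here is a simplified, stdlib-only sketch of that kind of check (my own illustration, not the real torch.fx source; wrap_sketch is a made-up name):

```python
import sys

def wrap_sketch(fn_name):
    # Look one frame up at the caller. Code executed at a module's top
    # level runs in a frame whose code object is named "<module>"; code
    # inside a function (or compiled without a Python frame, as with
    # Cython) does not, so the check below fails there.
    caller = sys._getframe(1)
    if caller.f_code.co_name != "<module>":
        raise NotImplementedError(
            "wrap must be called at the top level of a module"
        )
    return fn_name

def call_from_function():
    # Calling from inside a function reproduces the error.
    return wrap_sketch("add")
```

If something similar happens inside torch.fx.wrap, that would explain why the call works in plain pyadd.py but fails once the file is compiled into pyadd.so.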

Why does this happen? Any help is appreciated, thanks.