RuntimeError: Python builtin <built-in method apply of FunctionMeta object at 0x55dad1b31680> is currently not supported in Torchscript:

I was running the example code from https://github.com/Xilinx/brevitas/tree/master/examples and ran into this error. Does anyone know how to deal with it?

My PyTorch version is 1.4 (GPU build) and it otherwise works well. The CUDA version is 10.1 and the GPU is a 2080 Ti.

Many thanks.

Traceback (most recent call last):
  File "imagenet_val.py", line 146, in <module>
    main()
  File "imagenet_val.py", line 42, in main
    model = modelsarch
  File "/home/mjx/brevitasmaster/examples/models/mobilenetv1.py", line 121, in quant_mobilenet_v1
    net = MobileNet(channels=channels, first_stage_stride=first_stage_stride, bit_width=bit_width)
  File "/home/mjx/brevitasmaster/examples/models/mobilenetv1.py", line 75, in __init__
    weight_bit_width=FIRST_LAYER_BIT_WIDTH, activation_scaling_per_channel=True, act_bit_width=bit_width)
  File "/home/mjx/brevitasmaster/examples/models/mobilenetv1.py", line 28, in __init__
    self.conv = make_quant_conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=kernel_size, stride=stride, padding=padding, groups=groups, bias=False, bit_width=weight_bit_width)
  File "/home/mjx/brevitasmaster/examples/models/common.py", line 74, in make_quant_conv2d
    weight_scaling_min_val=weight_scaling_min_val)
  File "/home/mjx/brevitasmaster/brevitas/nn/quant_conv.py", line 179, in __init__
    override_pretrained_bit_width=weight_override_pretrained_bit_width)
  File "/home/mjx/brevitasmaster/brevitas/proxy/parameter_quant.py", line 357, in __init__
    self.re_init_tensor_quant()
  File "/home/mjx/brevitasmaster/brevitas/proxy/parameter_quant.py", line 360, in re_init_tensor_quant
    self.tensor_quant = self.lazy_tensor_quant_init(tracked_parameter_list=self._tracked_parameter_list)
  File "/home/mjx/brevitasmaster/brevitas/proxy/parameter_quant.py", line 146, in _weight_quant_init_impl
    affine=scaling_impl_type == ScalingImplType.AFFINE_STATS)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/__init__.py", line 1453, in init_then_script
    original_init(self, *args, **kwargs)
  File "/home/mjx/brevitasmaster/brevitas/core/scaling.py", line 246, in __init__
    stats_output_shape=stats_output_shape)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/__init__.py", line 1453, in init_then_script
    original_init(self, *args, **kwargs)
  File "/home/mjx/brevitasmaster/brevitas/core/scaling.py", line 171, in __init__
    self.restrict_scaling = RestrictValue(restrict_scaling_type, FloatToIntImplType.CEIL, scaling_min_val)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/__init__.py", line 1453, in init_then_script
    original_init(self, *args, **kwargs)
  File "/home/mjx/brevitasmaster/brevitas/core/restrict_val.py", line 82, in __init__
    float_to_int_impl = CeilSte()
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/__init__.py", line 1456, in init_then_script
    self.__dict__["_actual_script_module"] = torch.jit._recursive.create_script_module(self, stubs)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/_recursive.py", line 296, in create_script_module
    return create_script_module_impl(nn_module, concrete_type, cpp_module, stubs)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/_recursive.py", line 340, in create_script_module_impl
    create_methods_from_stubs(concrete_type, stubs)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/_recursive.py", line 259, in create_methods_from_stubs
    concrete_type._create_methods(defs, rcbs, defaults)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/_recursive.py", line 555, in try_compile_fn
    return torch.jit.script(fn, _rcb=rcb)
  File "/home/mjx/.conda/envs/py/lib/python3.7/site-packages/torch/jit/__init__.py", line 1281, in script
    fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(obj))
RuntimeError:
Python builtin <built-in method apply of FunctionMeta object at 0x55dad1b31680> is currently not supported in Torchscript:
  File "/home/mjx/brevitasmaster/brevitas/function/ops_ste.py", line 93

    """
    return ceil_ste_fn.apply(x)
           ~~~~~~~~~~~~~~~~~ <--- HERE

'ceil_ste' is being compiled since it was called from 'CeilSte.forward'
  File "/home/mjx/brevitasmaster/brevitas/core/function_wrapper.py", line 79
    @torch.jit.script_method
    def forward(self, x: torch.Tensor):
        return ceil_ste(x)
               ~~~~~~~~~~ <--- HERE

Hi,

I don't think TorchScript supports custom autograd Functions. Was that working before?
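For reference, a common way to sidestep the `Function.apply` limitation is the detach trick: the forward value comes from the rounding op, while gradients flow straight through to the input, which is exactly the straight-through-estimator behavior. This is just a sketch of the general technique, not the actual Brevitas code; the function name `ceil_ste` here mirrors the one in the traceback but is reimplemented from scratch.

```python
import torch

def ceil_ste(x: torch.Tensor) -> torch.Tensor:
    # Forward: ceil(x). Backward: the detach() term contributes no
    # gradient, so d(out)/d(x) = 1 (straight-through estimator).
    return (torch.ceil(x) - x).detach() + x

# Unlike torch.autograd.Function.apply, this uses only ops the
# TorchScript compiler understands, so it can be scripted:
scripted_ceil_ste = torch.jit.script(ceil_ste)
```

The trade-off is that autograd builds a small graph for the `ceil` and `sub` ops even though their gradients are discarded, but the scripted function behaves identically to the custom-`Function` version in both forward and backward.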

Sorry, I don't know; I'm running someone else's code.

So maybe I can try to modify that part.

Thanks a lot!