torch.compile on NumPy code throws a mean() arguments error

Hello,

I have a piece of NumPy code that runs fine. It contains a function named depth_consistency which returns a 1D array.
After calling this function, I usually need to compute the mean of absolute values of its output with np.mean(np.abs()).
Since this computation is repeated often, I moved it inside the function as well.
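
For reference, the tail of the function now looks roughly like this (reconstructed from the lines quoted in the traceback below; the masking code above it is omitted):

ratio12 = np.median(depth2_masked / depth1_masked)
depth12_diff = depth2_masked / ratio12 - depth1_masked
depth12_cons = np.mean(np.abs(depth12_diff))  # the line I moved inside the function
return coords, depth12_diff, depth12_cons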

Now this code is integrated into a DL training pipeline with PyTorch, so I have been using torch.compile on this function.
Before I moved the mean computation inside the function, everything worked fine: NumPy and PyTorch gave me the same result (using torch.mean(torch.abs()) on the function’s output).
However, since I moved the mean computation inside the function, I get the following error when running the PyTorch integration (the NumPy version still works like a charm):


W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] Backend compiler exception
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]   Explanation: Backend compiler `inductor` failed with aten.equal.default
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     While executing %ratio12 : [num_users=1] = call_function[target=torch._dynamo.utils.wrapped_median](args = (%truediv,), kwargs = {})
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     GraphModule: class GraphModule(torch.nn.Module):
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         def forward(self, L_stack0_: "f64[191754][1]", L_depth2_masked_: "f32[191754][1]"):
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             l_stack0_ = L_stack0_
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             l_depth2_masked_ = L_depth2_masked_
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:254 in torch_dynamo_resume_in_depth_consistency_at_251, code: ratio12 = np.median(depth2_masked / depth1_masked)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             truediv: "f64[191754][1]" = l_depth2_masked_ / l_stack0_
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             ratio12: "f64[1][1]" = torch__dynamo_utils_wrapped_median(truediv);  truediv = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:255 in torch_dynamo_resume_in_depth_consistency_at_251, code: depth12_diff = depth2_masked / ratio12 - depth1_masked
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             truediv_1: "f64[191754][1]" = l_depth2_masked_ / ratio12;  l_depth2_masked_ = ratio12 = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             depth12_diff: "f64[191754][1]" = truediv_1 - l_stack0_;  truediv_1 = l_stack0_ = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:273 in torch_dynamo_resume_in_depth_consistency_at_251, code: depth12_cons = np.linalg.norm(depth12_diff)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             depth12_cons: "f64[][]" = torch__dynamo_utils_wrapped_norm(depth12_diff);  depth12_cons = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:274 in torch_dynamo_resume_in_depth_consistency_at_251, code: depth12_cons = np.mean(np.abs(depth12_diff))
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             wrapped_absolute: "f64[191754][1]" = torch__dynamo_utils_wrapped_absolute(depth12_diff)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             depth12_cons_1: "f64[][]" = torch__dynamo_utils_wrapped_mean(wrapped_absolute);  wrapped_absolute = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             return (depth12_diff, depth12_cons_1)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     Original traceback:
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]       File "test_depth_script.py", line 254, in torch_dynamo_resume_in_depth_consistency_at_251
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         ratio12 = np.median(depth2_masked / depth1_masked)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     . Adding a graph break.
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]   Hint: Report an issue to the backend compiler repo.
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]   Developer debug context: Backend: inductor
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     Exception:aten.equal.default
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     While executing %ratio12 : [num_users=1] = call_function[target=torch._dynamo.utils.wrapped_median](args = (%truediv,), kwargs = {})
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     GraphModule: class GraphModule(torch.nn.Module):
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         def forward(self, L_stack0_: "f64[191754][1]", L_depth2_masked_: "f32[191754][1]"):
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             l_stack0_ = L_stack0_
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             l_depth2_masked_ = L_depth2_masked_
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:254 in torch_dynamo_resume_in_depth_consistency_at_251, code: ratio12 = np.median(depth2_masked / depth1_masked)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             truediv: "f64[191754][1]" = l_depth2_masked_ / l_stack0_
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             ratio12: "f64[1][1]" = torch__dynamo_utils_wrapped_median(truediv);  truediv = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:255 in torch_dynamo_resume_in_depth_consistency_at_251, code: depth12_diff = depth2_masked / ratio12 - depth1_masked
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             truediv_1: "f64[191754][1]" = l_depth2_masked_ / ratio12;  l_depth2_masked_ = ratio12 = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             depth12_diff: "f64[191754][1]" = truediv_1 - l_stack0_;  truediv_1 = l_stack0_ = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:273 in torch_dynamo_resume_in_depth_consistency_at_251, code: depth12_cons = np.linalg.norm(depth12_diff)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             depth12_cons: "f64[][]" = torch__dynamo_utils_wrapped_norm(depth12_diff);  depth12_cons = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]              # File: test_depth_script.py:274 in torch_dynamo_resume_in_depth_consistency_at_251, code: depth12_cons = np.mean(np.abs(depth12_diff))
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             wrapped_absolute: "f64[191754][1]" = torch__dynamo_utils_wrapped_absolute(depth12_diff)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             depth12_cons_1: "f64[][]" = torch__dynamo_utils_wrapped_mean(wrapped_absolute);  wrapped_absolute = None
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]             return (depth12_diff, depth12_cons_1)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     Original traceback:
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]       File "test_depth_script.py", line 254, in torch_dynamo_resume_in_depth_consistency_at_251
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         ratio12 = np.median(depth2_masked / depth1_masked)
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]     Traceback:
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]       File "test_depth_script.py", line 276, in torch_dynamo_resume_in_depth_consistency_at_251
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0]         return coords, depth12_diff, depth12_cons
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
W0821 11:31:59.761000 3707 site-packages/torch/_dynamo/exc.py:514] [8/0] 
(The same backend compiler exception is then logged a second time for frame [8/0_1]; the dump is identical apart from the timestamp, so I am omitting the duplicate here.)

test_depth_script.py:254: DeprecationWarning: __array_wrap__ must accept context and return_scalar arguments (positionally) in the future. (Deprecated NumPy 2.0)
  ratio12 = np.median(depth2_masked / depth1_masked)
test_depth_script.py:255: DeprecationWarning: __array_wrap__ must accept context and return_scalar arguments (positionally) in the future. (Deprecated NumPy 2.0)
  depth12_diff = depth2_masked / ratio12 - depth1_masked
test_depth_script.py:274: DeprecationWarning: __array_wrap__ must accept context and return_scalar arguments (positionally) in the future. (Deprecated NumPy 2.0)
  depth12_cons = np.mean(np.abs(depth12_diff))
Traceback (most recent call last):
  File "test_depth_script.py", line 358, in <module>
    coord_diff_torch, depth12_diff_torch, depth12_cons_torch = depth_consistency_torch(coords_torch, depth1_aligned_torch, depth2_est_torch)
                                                               ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/torch/_dynamo/eval_frame.py", line 655, in _fn
    return fn(*args, **kwargs)
  File "test_depth_script.py", line 243, in depth_consistency
    coords = coords[mask12,:]
  File "test_depth_script.py", line 245, in torch_dynamo_resume_in_depth_consistency_at_243
    depth1_masked = depth1[mask12]
  File "test_depth_script.py", line 249, in torch_dynamo_resume_in_depth_consistency_at_245
    coords = coords[mask12,:]
  File "test_depth_script.py", line 250, in torch_dynamo_resume_in_depth_consistency_at_249
    depth2_masked = depth2_masked[mask12]
  File "test_depth_script.py", line 251, in torch_dynamo_resume_in_depth_consistency_at_250
    depth1_masked = depth1_masked[mask12]
  File "test_depth_script.py", line 274, in torch_dynamo_resume_in_depth_consistency_at_251
    depth12_cons = np.mean(np.abs(depth12_diff))
  File "/usr/lib/python3.13/site-packages/numpy/_core/fromnumeric.py", line 3858, in mean
    return mean(axis=axis, dtype=dtype, out=out, **kwargs)
TypeError: mean() received an invalid combination of arguments - got (dtype=NoneType, out=NoneType, axis=NoneType, ), but expected one of:
 * (*, torch.dtype dtype = None)
 * (tuple of ints dim, bool keepdim = False, *, torch.dtype dtype = None)
 * (tuple of names dim, bool keepdim = False, *, torch.dtype dtype = None)


Am I right to assume this is purely a torch.compile issue (as the error message suggests) and that I cannot fix it on my side?
If that’s the case, I will try to file a bug report.

If not, could it be related to the NumPy 2.0 DeprecationWarning about __array_wrap__, and/or am I doing something in the NumPy code that I should not?
I assume not, since those warnings do not appear when running only the NumPy version, but I am not literate enough in how torch.compile and NumPy interact to be sure of that.
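
In case it helps, here is the kind of minimal reproduction I plan to try in order to isolate the np.mean call (I have not yet confirmed that it triggers the same error; the function body is just a stand-in for the pattern used in depth_consistency):

import numpy as np
import torch

def mean_abs_of_ratio(x):
    # stand-in for the tail of depth_consistency:
    # median-normalized difference, then mean of absolute values
    ratio = np.median(x)
    diff = x / ratio - x
    return np.mean(np.abs(diff))

compiled = torch.compile(mean_abs_of_ratio)
t = torch.rand(191754, dtype=torch.float64)
print(compiled(t))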

As a side note, replacing the np.mean(np.abs()) computation with np.linalg.norm() inside the function makes the code run fine in both versions; however, the PyTorch and NumPy outputs differ slightly in vector size and final value:

# Numpy vector size and norm output
(191753,)
225.48908773551372

# PyTorch vector size and norm output
torch.Size([191754])
225.48914043521384

So could the issue be something happening in the computation of the vector itself, before the mean/norm reduction? The NumPy vector has 191753 elements while the PyTorch one has 191754, so the two paths already seem to disagree by one element before the reduction.
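
Concretely, this is the check I intend to run to see where that one-element difference comes from (variable names are from my script: depth12_diff is the NumPy result and depth12_diff_torch the torch.compile result):

# compare the intermediate vectors element-wise before the reduction
diff_np = depth12_diff                          # from the pure NumPy run
diff_torch = depth12_diff_torch.cpu().numpy()   # from the torch.compile run
print(diff_np.shape, diff_torch.shape)
if diff_np.shape == diff_torch.shape:
    print(np.max(np.abs(diff_np - diff_torch)))
else:
    print("shape mismatch:", diff_np.shape, diff_torch.shape)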

Many thanks in advance for any information you can provide!