PyTorch low test coverage

I looked into some similar questions, but they don't have any responses.
I am trying to measure PyTorch test coverage with the coverage.py tool: https://coverage.readthedocs.io/

cd pytorch/test
coverage run --source torch -m run_test --exclude test_multiprocessing_spawn test_cpp_api_parity test_jit test_quantization test_overrides test_tensorexpr

coverage report  -m
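
coverage.py can also write a browsable HTML report (standard coverage.py usage, written to htmlcov/ by default), which makes it easier to see which lines are missed in each file:

coverage html
# then open htmlcov/index.html in a browser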

Report:

Name                                                                                                                  Stmts   Miss  Cover   Missing
---------------------------------------------------------------------------------------------------------------------------------------------------
/opt/conda/lib/python3.6/site-packages/torch/_VF.py                                                                      11      0   100%
/opt/conda/lib/python3.6/site-packages/torch/__config__.py                                                                5      2    60%   9, 17
/opt/conda/lib/python3.6/site-packages/torch/__future__.py                                                                5      2    60%   16, 19
/opt/conda/lib/python3.6/site-packages/torch/__init__.py                                                                248    104    58%   18, 45-129, 135, 162-175, 206-222, 236, 245, 267-269, 291, 306, 312, 397, 401, 487, 534-535
/opt/conda/lib/python3.6/site-packages/torch/_appdirs.py                                                                254    224    12%   59-69, 107-127, 161-193, 225-237, 270-292, 328-349, 383-391, 426-442, 449-453, 457, 462, 467, 472, 477, 482, 487, 498-511, 515-538, 542-565, 568-591, 594-606, 612-643
/opt/conda/lib/python3.6/site-packages/torch/_classes.py                                                                 23     11    52%   6-7, 10-13, 20-22, 26, 46
/opt/conda/lib/python3.6/site-packages/torch/_jit_internal.py                                                           318    191    40%   34-81, 111-127, 134-140, 190-202, 208-213, 223-229, 239-248, 251-252, 260, 447-468, 476-481, 484-487, 491-492, 496, 499, 503-507, 510-513, 533, 536, 581, 589-604, 609-611, 616-618, 623-625, 631-652, 655-661, 667-673, 680-697, 702, 720, 723, 727, 733, 738, 747, 754, 764-765, 769
/opt/conda/lib/python3.6/site-packages/torch/_linalg_utils.py                                                            46     33    28%   13-15, 23-26, 36-40, 48-50, 56-57, 63, 70, 77, 83-88, 95-104
/opt/conda/lib/python3.6/site-packages/torch/_lobpcg.py                                                                 319    296     7%   167-269, 290-309, 312-328, 333-351, 356-357, 366-385, 393, 404-414, 425, 433-470, 476-515, 562-570, 599-636, 663-732, 739
/opt/conda/lib/python3.6/site-packages/torch/_lowrank.py                                                                 81     71    12%   62-82, 126-130, 135-167, 232-271
/opt/conda/lib/python3.6/site-packages/torch/_namedtensor_internals.py                                                   59     46    22%   11-12, 20, 25-33, 37-40, 44, 47-54, 57, 61-62, 69-72, 77-80, 85-95, 128-142
/opt/conda/lib/python3.6/site-packages/torch/_ops.py                                                                     42     10    76%   22-27, 100-106
/opt/conda/lib/python3.6/site-packages/torch/_overrides.py                                                               58     48    17%   37, 167, 703-728, 762-776, 793, 804-835
/opt/conda/lib/python3.6/site-packages/torch/_six.py                                                                     32      7    78%   52-55, 60, 74-75, 78
/opt/conda/lib/python3.6/site-packages/torch/_storage_docs.py                                                            11      2    82%   27-28
/opt/conda/lib/python3.6/site-packages/torch/_tensor_docs.py                                                            386      0   100%
/opt/conda/lib/python3.6/site-packages/torch/_tensor_str.py                                                             237    214    10%   45-70, 75-132, 135, 138-149, 153-161, 165-193, 199-218, 221-242, 245-257, 261-275, 278-367, 370-371
/opt/conda/lib/python3.6/site-packages/torch/_torch_docs.py                                                             258      0   100%
/opt/conda/lib/python3.6/site-packages/torch/_utils.py                                                                  182    152    16%   24-43, 60-77, 81-88, 130-131, 135-141, 157-162, 165-171, 175-177, 181-200, 203-209, 213-217, 225-233, 250-253, 267-269, 285-291, 309-315, 333-337, 352-369, 375-379, 393, 401-405, 411-418
/opt/conda/lib/python3.6/site-packages/torch/_utils_internal.py                                                          31     14    55%   12, 22, 26, 34, 43-55
/opt/conda/lib/python3.6/site-packages/torch/_vmap_internals.py                                                          89     61    31%   75-79, 89-103, 108-110, 115-119, 125-144, 151-163, 171-178, 181-185, 231-248
/opt/conda/lib/python3.6/site-packages/torch/autograd/__init__.py                                                        66     48    27%   26-54, 103-125, 170-190, 210, 214-215, 218
/opt/conda/lib/python3.6/site-packages/torch/autograd/_functions/__init__.py                                              1      1     0%   1
/opt/conda/lib/python3.6/site-packages/torch/autograd/_functions/tensor.py                                               31     31     0%   1-51
/opt/conda/lib/python3.6/site-packages/torch/autograd/_functions/utils.py                                                37     37     0%   1-56
/opt/conda/lib/python3.6/site-packages/torch/autograd/anomaly_mode.py                                                    19      8    58%   71-72, 77, 80, 99-100, 103, 106
/opt/conda/lib/python3.6/site-packages/torch/autograd/function.py                                                       175    106    39%   25, 38, 41, 60, 67-71, 78, 148, 168, 190, 198-234, 249-250, 256-257, 262-275, 285-287, 293-311, 323-337, 358-363, 366-371, 374-376, 381-385, 388-389, 393-394, 397, 400, 403, 406
/opt/conda/lib/python3.6/site-packages/torch/autograd/functional.py                                                     227    212     7%   8-22, 30-39, 49-61, 67-73, 78-89, 95-121, 128-145, 153-188, 248-273, 331-363, 423-461, 536-557, 608-640, 705-742
/opt/conda/lib/python3.6/site-packages/torch/autograd/grad_mode.py                                                       42     23    45%   10, 14-15, 20-30, 64-65, 68, 103-104, 107, 145-146, 149, 152
/opt/conda/lib/python3.6/site-packages/torch/autograd/gradcheck.py                                                      254    238     6%   10-16, 20-33, 37-43, 53-136, 142-177, 181-186, 190, 252-401, 465-491
/opt/conda/lib/python3.6/site-packages/torch/autograd/profiler.py                                                       474    391    18%   10-20, 26-31, 34, 48-98, 102, 106, 121, 137-195, 209-219, 227-232, 301-308, 311-321, 324-331, 334-336, 339-342, 345-347, 350-351, 356-357, 361-362, 366-367, 375-376, 414-416, 419-420, 423-425, 446-453, 538-540, 543-556, 559-563, 572, 580-586, 591-594, 598-608, 627, 631, 636-637, 640, 651-663, 666, 674-675, 681-683, 689-691, 697-699, 705, 709, 713, 716, 741-753, 756-780, 783, 786, 809-810, 817-951, 960, 963-965, 969-1028, 1043-1179
/opt/conda/lib/python3.6/site-packages/torch/autograd/variable.py                                                         9      1    89%   7
/opt/conda/lib/python3.6/site-packages/torch/backends/__init__.py                                                        29     10    66%   14, 19-24, 32, 35-38
/opt/conda/lib/python3.6/site-packages/torch/backends/cuda/__init__.py                                                   45     15    67%   10, 21, 24-26, 36, 47, 67-74, 77, 81
/opt/conda/lib/python3.6/site-packages/torch/backends/cudnn/__init__.py                                                  71     44    38%   9-10, 23-45, 49-51, 63, 67-83, 87-93, 98-105
/opt/conda/lib/python3.6/site-packages/torch/backends/cudnn/rnn.py                                                       33     33     0%   1-58
/opt/conda/lib/python3.6/site-packages/torch/backends/mkl/__init__.py                                                     3      0   100%
/opt/conda/lib/python3.6/site-packages/torch/backends/mkldnn/__init__.py                                                 23     10    57%   8, 11-13, 17-23
/opt/conda/lib/python3.6/site-packages/torch/backends/openmp/__init__.py                                                  3      1    67%   6
/opt/conda/lib/python3.6/site-packages/torch/backends/quantized/__init__.py                                              37     16    57%   9-18, 23-24, 28, 31, 35-36, 39
/opt/conda/lib/python3.6/site-packages/torch/backends/xnnpack/__init__.py                                                17     17     0%   1-25
/opt/conda/lib/python3.6/site-packages/torch/contrib/__init__.py                                                          0      0   100%
/opt/conda/lib/python3.6/site-packages/torch/contrib/_tensorboard_vis.py                                                 74     74     0%   1-141
......
---------------------------------------------------------------------------------------------------------------------------------------------------
TOTAL                                                                                                                 41360  33608    19%

19% coverage seems very low. Is this the right way to measure test coverage?


@ezyang Will you be able to help with this? If not, could you point me to the right person?

Hi,

Just as a quick note:
I think a few of these are actually relevant and we should add tests for them.
Note though that excluding test_jit and test_multiprocessing will reduce the coverage by quite a bit.
Also, some of the autograd code is mostly called from C, so I think it is missed by this kind of tool (we should double check that).
Also, I'm very surprised to see things like gradcheck.py in there (lines 252-401 missed). Most of the autograd test suite is actually just calling this function, so maybe the detection is not working well with our test runner?
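
One quick way to double check that (just a sketch, assuming test_autograd.py can still be run as a standalone script from pytorch/test rather than through run_test.py):

coverage run --source torch test_autograd.py
coverage report -m | grep gradcheck

If gradcheck.py still shows up as mostly missed there, the tracing itself is probably the problem; if not, it is more likely something about how run_test launches the tests.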


@albanD Thanks for replying. I will enable test_jit and test_multiprocessing and calculate the test coverage again. I found test_multiprocessing to be flaky; it fails randomly, so I had disabled it.
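
Before re-running, one thing I also want to double check: as far as I can tell, run_test.py launches each test file in a separate Python process, and coverage.py does not measure subprocesses unless it is configured to. A rough sketch of the setup I plan to try, based on the coverage.py documentation on measuring subprocesses (the .coveragerc and sitecustomize.py below are my own additions, not part of PyTorch):

# .coveragerc in pytorch/test
[run]
source = torch
parallel = True

# start coverage in every child interpreter as well
export COVERAGE_PROCESS_START=$PWD/.coveragerc
echo "import coverage; coverage.process_startup()" > sitecustomize.py
export PYTHONPATH=$PWD:$PYTHONPATH

coverage run -m run_test
coverage combine
coverage report -m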