I just noticed that my conda environment is 13 GB in size. I looked up which packages take the most space, and they are mostly PyTorch / CUDA modules.
I wonder whether that's normal, or whether there's a way to avoid using so much disk space.
This is the list of the modules that take the most space in my environment, with their respective sizes:
pytorch-1.13.0-py3.10_cuda11.7_cudnn8.5.0_0.json: "size": 1229942678,
nsight-compute-2022.3.0.22-0.json: "size": 639598244,
libcublas-dev-11.11.3.6-0.json: "size": 413253930,
libcublas-11.11.3.6-0.json: "size": 381637889,
libcusparse-dev-11.7.5.86-0.json: "size": 377165256,
libcufft-dev-10.9.0.58-0.json: "size": 289240757,
mkl-2021.4.0-h06a4308_640.json: "size": 229783051,
libcusparse-11.7.5.86-0.json: "size": 184891885,
libnpp-11.8.0.86-0.json: "size": 154995740,
libnpp-dev-11.8.0.86-0.json: "size": 151472361,
libcufft-10.9.0.58-0.json: "size": 149741913,
cuda-nvvp-11.8.87-0.json: "size": 119905249,
cuda-nsight-11.8.86-0.json: "size": 119143043,
libcusolver-11.4.1.48-0.json: "size": 101143771,
I produced the list above with:

grep '"size":' ${CONDA_PREFIX}/conda-meta/*.json | sort -k3rn | sed 's/.*conda-meta\///g' | column -t
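For anyone who prefers to do the same thing without a shell pipeline, here is a small Python sketch of the same idea: read each record in conda-meta and sort by its "size" field. The `largest_packages` helper name is my own, and the assumption that every record carries a top-level "size" key (defaulting to 0 when missing) may not hold for all conda versions.

```python
import glob
import json
import os

def largest_packages(meta_dir, top=15):
    """Return (size_bytes, json_filename) pairs, largest first."""
    sizes = []
    for path in glob.glob(os.path.join(meta_dir, "*.json")):
        with open(path) as f:
            meta = json.load(f)
        # "size" is the package's declared size in bytes; assume 0 when
        # a record omits the key (an assumption, not guaranteed by conda)
        sizes.append((meta.get("size", 0), os.path.basename(path)))
    return sorted(sizes, reverse=True)[:top]

if __name__ == "__main__":
    meta_dir = os.path.join(os.environ["CONDA_PREFIX"], "conda-meta")
    for size, name in largest_packages(meta_dir):
        print(f"{size / 1e9:6.2f} GB  {name}")
```

Note that these are the sizes conda records for each package, not necessarily bytes on disk, since hard-linking from the package cache can make the on-disk footprint differ.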
Thanks,
B.