DataParallel memory consumption in PyTorch 0.4

```
C:\Anaconda3\lib\site-packages\torch\cuda\nccl.py:24: UserWarning: PyTorch is not compiled with NCCL support
  warnings.warn('PyTorch is not compiled with NCCL support')
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-1-fdbbe4fefe36> in <module>()
      1 import torch
      2 print(torch.cuda.nccl.is_available(torch.randn(1).cuda()))
----> 3 print(torch.cuda.nccl.version())

C:\Anaconda3\lib\site-packages\torch\cuda\nccl.py in version()
     29 
     30 def version():
---> 31     return torch._C._nccl_version()
     32 
     33 

AttributeError: module 'torch._C' has no attribute '_nccl_version'
```

I installed PyTorch with the conda method from the main site: Windows -> conda -> Python 3.6 -> CUDA 8.0.
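The traceback above happens because Windows builds of PyTorch are compiled without NCCL, so the `torch._C._nccl_version` binding simply does not exist and `torch.cuda.nccl.version()` raises `AttributeError`. A minimal sketch of a defensive check, using `getattr` so missing bindings fall through instead of raising (the `SimpleNamespace` here is a hypothetical stand-in for `torch._C` on a build without NCCL, so the snippet runs without PyTorch installed):

```python
import types

# Hypothetical stand-in for torch._C on a Windows build: the
# _nccl_version attribute is simply absent, which is exactly why
# torch.cuda.nccl.version() raised AttributeError in the traceback.
_C_without_nccl = types.SimpleNamespace()

def nccl_version_or_none(c_module):
    """Return the NCCL version if the build exposes it, else None."""
    fn = getattr(c_module, "_nccl_version", None)
    return fn() if fn is not None else None

print(nccl_version_or_none(_C_without_nccl))  # → None on a build without NCCL
```

The same `getattr` guard applied to the real `torch._C` would let a script report "no NCCL" instead of crashing on Windows.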

Edit: I just read the replies from peterjc and the OP; it makes sense now.