Missing torch.nn.utils.clip_grad_norm in Anaconda-based PyTorch installation!

Hi there,
I've installed PyTorch with this command: conda install pytorch torchvision -c soumith. The PyTorch documentation page lists several functions under torch.nn.utils, for example:

  1. torch.nn.utils.clip_grad_norm()
  2. torch.nn.utils.rnn.PackedSequence()

My question: when I try to access these functions (especially clip_grad_norm), Python raises:

AttributeError: module 'torch.nn.modules.utils' has no attribute 'clip_grad_norm'

How can I fix this error? It seems the installation produced by the command above is somehow incomplete!
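For context, clip_grad_norm rescales all gradients so that their combined (global) L2 norm does not exceed a given max_norm. A minimal pure-Python sketch of that idea (not PyTorch's actual implementation, and operating on plain lists of floats rather than tensors) looks like this:

```python
import math

def clip_grad_norm(grads, max_norm):
    # grads: list of gradient vectors, each a list of floats.
    # Compute the global L2 norm across all gradients.
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    # Scale every gradient down if the global norm exceeds max_norm.
    clip_coef = max_norm / (total_norm + 1e-6)
    if clip_coef < 1:
        grads = [[g * clip_coef for g in grad] for grad in grads]
    return grads, total_norm
```

In PyTorch you would call the real function between loss.backward() and optimizer.step(), passing model.parameters() and a max norm.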


If you want to use the latest version of PyTorch, you should install it from source.

Below is an example for installing PyTorch from source.

$ git clone https://github.com/pytorch/pytorch
$ export CMAKE_PREFIX_PATH=/home/yunjey/anaconda3   # your anaconda path
$ conda install numpy mkl setuptools cmake gcc cffi
$ conda install -c soumith magma-cuda80
$ pip install -r requirements.txt
$ python setup.py install
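After installing, you can check whether your build actually exposes the function. A small sketch (the helper name has_clip_grad_norm is mine, not part of PyTorch) that returns None when torch is not installed at all:

```python
import importlib

def has_clip_grad_norm():
    # Returns True/False depending on whether the installed torch
    # exposes torch.nn.utils.clip_grad_norm, or None if torch is
    # not installed in this environment.
    try:
        utils = importlib.import_module("torch.nn.utils")
    except ImportError:
        return None
    return hasattr(utils, "clip_grad_norm")
```

If this returns False, the environment is still picking up the older binary build rather than the source install.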

Thanks, problem was solved.

We’ll be publishing new binaries soon and they will include that function.


Thank you @apaszke . I am eagerly waiting for your new release!

Could you please tell me when you are going to publish the new binaries?

We're working on fixing some issues with compilers over-optimizing the binaries with AVX2 vector instructions, which are not available on many older CPUs. We're going to publish the binaries once that's done.