[v0.2] Higher order gradients, Distributed PyTorch, Broadcasting, Advanced Indexing, New Layers and more

Here comes the next major release of PyTorch, just in time for ICML.
Install it today from our website http://pytorch.org

We’re introducing long-awaited features such as Broadcasting, Advanced Indexing, Higher-order gradients and finally: Distributed PyTorch.

With the introduction of Broadcasting, the behavior of code in certain broadcastable situations differs from its behavior in 0.1.12. This might lead to silent bugs in your existing code. We’ve provided easy ways of identifying such ambiguous code in the Important Breakages and Workarounds section.
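To make the change concrete, here is a minimal sketch of code whose result changes under the new semantics, together with the `torch.utils.backcompat` warning flag that the release notes describe for locating such call sites:

```python
import torch
import torch.utils.backcompat

# Warn at every site where the new broadcasting semantics change the
# result of code that was valid in 0.1.12, to help locate ambiguous code.
torch.utils.backcompat.broadcast_warning.enabled = True

# In 0.1.12, pointwise ops only required equal element counts, so a (4, 1)
# tensor plus a (4,) tensor was added element-by-element, keeping the
# (4, 1) shape. With numpy-style broadcasting, both operands are expanded
# and the result is (4, 4) instead.
a = torch.randn(4, 1)
b = torch.randn(4)
print((a + b).size())  # torch.Size([4, 4]) in 0.2.0
```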

Our detailed release notes cover the following topics.
Please read them here: https://github.com/pytorch/pytorch/releases/tag/v0.2.0

  • Tensor Broadcasting (numpy-style)
  • Advanced Indexing for Tensors and Variables (sketch below)
  • Higher-order gradients (sketch below)
  • Distributed PyTorch (multi-node training, etc.; sketch below)
  • Neural Network layers and features: SpatialTransformers, WeightNorm, EmbeddingBag, etc.
  • New in torch and autograd: matmul, inverse, etc. (sketch below)
  • Easier debugging, better error messages
  • Bug Fixes
  • Important Breakages and Workarounds
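As a taste of the new indexing, a minimal sketch assuming the 0.2.0 numpy-style semantics (shapes and values here are arbitrary):

```python
import torch

x = torch.randn(4, 4)

# Pure integer-array indexing: gathers x[0, 1] and x[3, 2].
elems = x[torch.LongTensor([0, 3]), torch.LongTensor([1, 2])]

# Index tensors can be mixed with colons: rows 0 and 3, all columns.
rows = x[torch.LongTensor([0, 3]), :]

# Boolean (ByteTensor) masks select matching elements as a 1-D tensor.
positive = x[x > 0]
```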
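Higher-order gradients are driven by the new torch.autograd.grad function; a minimal sketch of differentiating twice, using the 0.2-era Variable API:

```python
import torch
from torch.autograd import Variable, grad

x = Variable(torch.randn(5), requires_grad=True)
y = (x ** 3).sum()

# create_graph=True builds a graph of the backward pass itself, so the
# returned gradient is again differentiable.
g, = grad(y, x, create_graph=True)  # dy/dx = 3 * x^2
h, = grad(g.sum(), x)               # d(sum(g))/dx = 6 * x
```

This is the mechanism behind losses that include a gradient norm in the objective, for example gradient penalties.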
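And a hedged sketch of bringing up torch.distributed and summing a tensor across processes; the backend choice, address, world size, and rank are placeholders you would adapt to your own cluster:

```python
import torch
import torch.distributed as dist

# Every participating process runs the same script with the same
# init_method but its own rank in [0, world_size).
dist.init_process_group(backend='tcp',
                        init_method='tcp://192.168.0.1:23456',
                        world_size=4,
                        rank=0)

t = torch.ones(2, 2)
dist.all_reduce(t)  # in-place elementwise sum across all processes
print(t)            # every entry equals world_size afterwards
```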
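Finally, a small sketch of torch.matmul, which follows numpy.matmul semantics (the batched case below is our assumption based on those semantics):

```python
import torch

a = torch.randn(3, 4)
b = torch.randn(4, 5)
c = torch.matmul(a, b)  # ordinary matrix product, shape (3, 5)

# Per numpy.matmul semantics (an assumption here), extra leading
# dimensions are treated as a batch: (10, 3, 4) x (4, 5) -> (10, 3, 5).
batch = torch.randn(10, 3, 4)
out = torch.matmul(batch, b)
```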

Package documentation is available here: http://pytorch.org/docs/0.2.0/

Discuss the release here!

Cheers,
The PyTorch Team


Is beta considered over now?

The 0.x releases are all not production-ready. When we are confident of the quality, we’ll transition to 1.0, at which point we shall declare beta to be over.


Did you prepare binary packages for conda?

Yes, of course. Please use the commands on our website.


Thank you, dear @smth. Feeling amazed :heart_eyes::heart_eyes::heart_eyes:

Broadcasting is very helpful. Thank you all for the great PyTorch.

I am looking at the torch.distributed documentation (Distributed communication package — PyTorch master documentation). I will have to delay my thesis work once again and spend all my days trying out these new toys.
