[announcement] those of you who use the master branch, breaking changes incoming

Dear PyTorch users,

Most of you use our stable releases. Our current stable release is v0.1.2.

However, some of you use the master branch of PyTorch, and we want to give you a heads-up about some breaking changes that will start being merged today.
These breaking changes come from introducing NumPy-like broadcasting into PyTorch (see PR #1563).
In v0.2 we will release a comprehensive set of backward-compatibility warnings and codemod mechanisms to detect code whose behavior will change, so that you can find and fix it.
However, these warnings will only be available when v0.2 is released, one to two weeks from now.

In this small window of time, our master branch will change behavior.

We hope this does not cause you too much trouble.

Best,
The PyTorch Team

Do you mean that we will no longer need torch.expand_as(), and will be able to directly sum two dimension-compatible tensors?

Yes, you can broadcast directly, instead of having to call expand/expand_as all the time.
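
For illustration, a minimal sketch of what this means in code (the shapes here are just an example, not taken from the PR):

```python
import torch

a = torch.randn(4, 3)
b = torch.randn(1, 3)

# Old style: pointwise ops required matching sizes, so `b` had to be
# expanded to (4, 3) explicitly before the addition
c_old = a + b.expand_as(a)

# With NumPy-like broadcasting, the (1, 3) tensor is expanded
# automatically during the addition
c_new = a + b
```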

We hope this does not cause you too much trouble.

In the long run, this’ll make life that much easier.

Note that in the vast majority of cases, this won't be breaking at all. Previously, we accepted addition of tensors with shapes (4, 1) and (1, 4), which would give a (4, 1) tensor as a result. After these changes, the inputs will be broadcast against each other and the result will be a (4, 4) tensor. So as long as you don't depend on that old behavior, and all shapes in your program match, the changes won't change the behavior at all. If we detect that you depended on it, a warning will be raised.
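
To make that concrete, a minimal sketch of the one case that does change (the printed size assumes a post-broadcasting build of master):

```python
import torch

x = torch.ones(4, 1)
y = torch.ones(1, 4)

z = x + y
# Before the change: z had shape (4, 1) -- pointwise ops only checked that
# the element counts matched (4 == 4) and kept x's shape.
# After the change:  z has shape (4, 4) -- x and y are broadcast together,
# as in NumPy.
print(z.size())  # torch.Size([4, 4]) on a post-broadcasting build
```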

Hello @apaszke, can you have a look at https://discuss.pytorch.org/t/backward-error-when-using-expand/4273? Is that problem also related to this change?

When is the expected release date of v0.2?

Any updates on the expected release date of v0.2?

The new release is out:
https://discuss.pytorch.org/t/v0-2-higher-order-gradients-distributed-pytorch-broadcasting-advanced-indexing-new-layers-and-more
