Model parameters were cut off with concatenation in PyTorch 0.3.1

Hi,

First, you should upgrade your version of pytorch :wink:
Second, .data should not be used anymore, and it should never be used in a network’s forward pass if you want gradients.

What it does (in a very unsafe way) is lose track of the history of a tensor. In your case, the new x is not linked to the original one, so whatever you do with the new x won’t compute gradients for the old one.
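A minimal sketch of the effect (using a toy tensor w rather than your actual x, since your full code isn’t shown here):

```python
import torch

w = torch.ones(3, requires_grad=True)

# Detached path: .data returns a tensor outside the autograd graph,
# so the history is lost and gradients cannot flow back to w.
y_detached = (w.data * 2).sum()
print(y_detached.requires_grad)  # False: no history recorded

# Tracked path: operating on w directly keeps the graph intact.
y = (w * 2).sum()
y.backward()
print(w.grad)  # tensor([2., 2., 2.])
```

Calling .backward() on y_detached would fail, because autograd never recorded the operations that produced it.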

You should remove the .data (and in other places in your code as well!).