VAE example: UserWarning on `binary_cross_entropy`

I’m using torch version 1.3.1+cpu and torchvision version 0.4.2+cpu.

I’m running the VAE example from master (0c1654d), but it produces this warning followed by an error:

/home/gon1332/Development/Training/ML/learning-data-augmentation/model.py:147: UserWarning: Using a target size (torch.Size([128, 784])) that is different to the input size (torch.Size([128, 20])) is deprecated. Please ensure they have the same size.
  BCE = F.binary_cross_entropy(recon_x, x.view(-1, 784), reduction='sum')
Traceback (most recent call last):
  File "main.py", line 38, in <module>
    main()
  File "main.py", line 34, in main
    ...
    BCE = F.binary_cross_entropy(recon_x, x.view(-1, 784), reduction='sum')
  File "-/.local/lib/python3.6/site-packages/torch/nn/functional.py", line 2058, in binary_cross_entropy
    "!= input nelement ({})".format(target.numel(), input.numel()))
ValueError: Target and input must have the same number of elements. target nelement (100352) != input nelement (2560)

x and recon_x have the following sizes:

recon_x.size() = torch.Size([128, 20])
x.size() = torch.Size([128, 1, 28, 28]), and after x.view(-1, 784) it becomes torch.Size([128, 784])
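
The mismatch can also be reproduced in isolation with just these shapes. This is a minimal sketch with made-up tensors, not code from the example; it raises a ValueError for the mismatched element counts:

import torch
import torch.nn.functional as F

recon_x = torch.rand(128, 20)     # 128 * 20 = 2560 elements
x = torch.rand(128, 1, 28, 28)    # 128 * 784 = 100352 elements once flattened

# element counts differ between input and target -> ValueError
F.binary_cross_entropy(recon_x, x.view(-1, 784), reduction='sum')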

The printed shapes match the mismatch reported in the error message: both tensors contain 128 samples, but recon_x has 20 elements per sample while the flattened target has 784.
How would you like to compute the binary cross entropy between tensors with a different number of elements?
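
In the reference example, the decoder maps the 20-dimensional latent code back to 784 outputs through a sigmoid, so recon_x should come out as [128, 784] and line up with x.view(-1, 784). A minimal, self-contained sketch of that shape expectation (layer sizes 20 -> 400 -> 784 as far as I remember; the names here are illustrative):

import torch
import torch.nn as nn
import torch.nn.functional as F

fc3 = nn.Linear(20, 400)     # latent -> hidden
fc4 = nn.Linear(400, 784)    # hidden -> flattened 28x28 image

z = torch.randn(128, 20)                      # a batch of latent codes
recon_x = torch.sigmoid(fc4(F.relu(fc3(z))))  # shape: [128, 784]

x = torch.rand(128, 1, 28, 28)
BCE = F.binary_cross_entropy(recon_x, x.view(-1, 784), reduction='sum')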

Did you change anything in the example code?
I just retried it with the latest nightly build and it seems to work without any errors.