Hello there,

I’m currently trying to implement a VAE for dimensionality reduction.

As a starting point, I used PyTorch’s VAE example for the MNIST dataset.

My own problem, however, does not involve images but a 17-dimensional vector of continuous values.

I want to use the VAE to reduce these dimensions to something smaller. Additionally, I feed in a “history” of these values, so that information about previous time steps is carried into the network.

So, overall, the input to my network has the following shape:

[BATCH_SIZE x HISTORY_LENGTH x FEATURE_SIZE]

In a small example I used the following:

[64 x 3 x 17]
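For illustration, I build these history windows roughly like this (the `series` tensor here is just a random stand-in for my real data, and the window construction is a sketch of my approach, not the exact code):

```python
import torch

HISTORY_LENGTH, FEATURE_SIZE = 3, 17

# Random stand-in for a sequence of 17-dimensional continuous measurements
series = torch.rand(100, FEATURE_SIZE)

# Stack sliding windows so every sample carries its previous values
windows = torch.stack([series[i:i + HISTORY_LENGTH]
                       for i in range(len(series) - HISTORY_LENGTH + 1)])

print(windows.shape)  # torch.Size([98, 3, 17])
```

Batching 64 of these windows then gives the [64 x 3 x 17] input above.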

Long story short, I run into problems when calculating the loss function (I started from the PyTorch example: https://github.com/pytorch/examples/blob/master/vae/main.py#L72).

Since I don’t use images as input, I changed the binary cross-entropy to regular cross-entropy, using `F.cross_entropy(reconstructed_x, x, reduction='sum')`

instead of `F.binary_cross_entropy(reconstructed_x, x, reduction='sum')`
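For context, my loss function looks roughly like this (the latent size of 10 and the random tensors are illustrative stand-ins; `loss_function` follows the linked example with the cross-entropy call swapped in):

```python
import torch
import torch.nn.functional as F

BATCH_SIZE, HISTORY_LENGTH, FEATURE_SIZE = 64, 3, 17
LATENT_SIZE = 10  # stand-in value, not my real latent size

# Random stand-ins with the shapes from my setup
x = torch.rand(BATCH_SIZE, HISTORY_LENGTH, FEATURE_SIZE)
reconstructed_x = torch.rand(BATCH_SIZE, HISTORY_LENGTH, FEATURE_SIZE)
mu = torch.zeros(BATCH_SIZE, LATENT_SIZE)
logvar = torch.zeros(BATCH_SIZE, LATENT_SIZE)

def loss_function(reconstructed_x, x, mu, logvar):
    # This call is what fails: given a 3-D input, F.cross_entropy
    # treats dim 1 (HISTORY_LENGTH) as a class dimension and expects
    # integer class-index targets of shape [BATCH_SIZE, FEATURE_SIZE],
    # not my continuous [64, 3, 17] target.
    recon_loss = F.cross_entropy(reconstructed_x, x, reduction='sum')
    # KL divergence term, unchanged from the example
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kld
```

(I’m aware that on recent PyTorch versions a same-shaped float target may instead be accepted as soft class probabilities over dim 1, which also isn’t what I want.)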

and now I get the following error message:

`ValueError: Expected target size (64, 17), got torch.Size([64, 3, 17])`

Can somebody tell me what I am missing?

Thanks in advance!