I get this error: The size of tensor a (4096) must match the size of tensor b (10694) at non-singleton dimension 1.
Since the VAE is supposed to reduce the feature dimension, is it possible to solve this error?
It’s not an error but a design mistake on your side.
You need to set up the model so that the number of features in the output matches the number of features in the input.
Just to add a keyword to what Juan already said: search for “downsampling”, as that’s what you’ll need to implement here. Essentially you’ll need to pick some downsampling function that reduces the 10694 dimensions down to 4096. Conventionally you could use a layer from the AvgPool family:
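A minimal sketch of that idea, assuming your features are shaped `(batch, 10694)`. `nn.AdaptiveAvgPool1d` (part of the AvgPool family) lets you specify the output size directly, so you don’t have to work out kernel/stride values that land exactly on 4096; the batch size of 8 here is just an illustrative assumption:

```python
import torch
import torch.nn as nn

# Assumed input: a batch of 8 feature vectors with 10694 features each
x = torch.randn(8, 10694)

# Adaptive average pooling produces a fixed output length (4096)
# regardless of the input length
pool = nn.AdaptiveAvgPool1d(4096)

# AdaptiveAvgPool1d expects (N, C, L), so add a channel dim and remove it after
y = pool(x.unsqueeze(1)).squeeze(1)
print(y.shape)  # torch.Size([8, 4096])
```

The pooled tensor will then match the 4096-dimensional tensor in your loss computation. Alternatively, a plain `nn.Linear(10694, 4096)` would give you a learned projection instead of a fixed pooling.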