Hello. I’m using skip connections in a VAE:
def _encode(self, x):
    res1e = x  # saved for the skip connection
    lin1 = self.relu(self.lin_bn1(self.linear1(x)))
    lin2 = self.relu(self.lin_bn2(self.linear2(lin1)))
    lin2 = lin2 + self.skip(res1e)
I get this error: The size of tensor a (4096) must match the size of tensor b (10694) at non-singleton dimension 1.
Since a VAE is meant to reduce the feature dimension, is it possible to solve this error?
It’s not really an error but a design mistake on your side.
You need to set up the skip path so that the number of features it outputs matches the number of features of the tensor you are adding it to.
Just to add a keyword to what Juan already said, do a search for “downsampling”, as that’s what you’ll need to implement here. Essentially you’ll need to pick some downsample function that reduces the 10694 input features down to 4096. Conventionally you could use an AvgPool-family layer:
lin2 = lin2 + downsample(res1e)
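To make that concrete, here is a minimal sketch of two common ways to build that `downsample` step, assuming the sizes from your error message (10694 input features, 4096 hidden features); the variable names `skip_proj` and `downsample` are just illustrative, not part of your model:

```python
import torch
import torch.nn as nn

in_features, hidden = 10694, 4096  # taken from the error message

x = torch.randn(8, in_features)    # a dummy batch of 8 samples
res1e = x                          # saved for the skip connection
lin2 = torch.randn(8, hidden)      # stand-in for the encoder output

# Option 1: a learned linear projection on the skip path
skip_proj = nn.Linear(in_features, hidden)
out_proj = lin2 + skip_proj(res1e)

# Option 2: parameter-free adaptive average pooling; it pools the
# last dimension down to exactly `hidden` values.
downsample = nn.AdaptiveAvgPool1d(hidden)
out_pool = lin2 + downsample(res1e)

print(out_proj.shape)  # torch.Size([8, 4096])
print(out_pool.shape)  # torch.Size([8, 4096])
```

The linear projection adds parameters but lets the network learn how to compress the skip; adaptive pooling is free but just averages neighboring features.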
Good! Thank you very much.