Hi everyone.

I am trying to create an autoencoder (AE) and insert an extra neuron after the encoder. I can do this easily in the forward pass by concatenating a new value onto the encoded data and feeding the result into the decoder. However, the backward pass doesn't allow it and fails with mismatched dimensions.

## Creating a PyTorch class

```python
import numpy as np
import torch

class AE(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: 9351 inputs down to 9 features
        self.encoder = torch.nn.Sequential(
            torch.nn.Linear(9351, 128),
            torch.nn.Linear(128, 9)
        )
        # Decoder: expects 10 features (9 encoded + 1 extra neuron)
        self.decoder = torch.nn.Sequential(
            torch.nn.Linear(10, 18),
            torch.nn.Linear(18, 9351),
            torch.nn.Sigmoid()
        )

    def forward(self, x, x2):
        encoded = self.encoder(x)
        # Append the extra neuron to the encoded vector
        encoded.data = torch.FloatTensor(np.concatenate((encoded.data, x2), axis=0))
        decoded = self.decoder(encoded)
        return decoded

model = AE()
```

## Validation using MSE Loss function

```python
loss_function = torch.nn.MSELoss()
```

## Using an Adam Optimizer with lr = 0.1

```python
optimizer = torch.optim.Adam(model.parameters(), lr=1e-1, weight_decay=1e-8)

epochs = 10
outputs = []
losses = []
```

```python
torch.autograd.set_detect_anomaly(True)

for epoch in range(epochs):
    for i, image in enumerate(loader):
        x2 = [temp[i]]
        reconstructed = model(image, x2)

        optimizer.zero_grad()
        optimizer.step()
        loss = loss_function(reconstructed, image)
        loss.backward(retain_graph=True, inputs=reconstructed)
        losses.append(loss)
```
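For comparison, the conventional ordering of a PyTorch training step is zero gradients, forward, loss, backward, then step (the loop above calls `optimizer.step()` before `loss.backward()`). A minimal generic sketch — the tiny model and random data here are purely illustrative, not the AE and loader above:

```python
import torch

# Illustrative stand-ins; not the AE/loader from the code above.
toy_model = torch.nn.Linear(4, 4)
toy_loss_fn = torch.nn.MSELoss()
toy_optimizer = torch.optim.Adam(toy_model.parameters(), lr=1e-3)

x = torch.randn(8, 4)

for epoch in range(2):
    toy_optimizer.zero_grad()       # clear gradients from the previous step
    out = toy_model(x)              # forward pass
    loss = toy_loss_fn(out, x)      # compute the loss
    loss.backward()                 # populate parameter .grad fields
    toy_optimizer.step()            # update parameters using those gradients
```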

Running this fails with:

```
RuntimeError: Function UnsqueezeBackward0 returned an invalid gradient at index 0 - got [10] but expected shape compatible with [9]
```
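The shapes in the error seem to match the concatenation: the encoder output has 9 elements, but after the in-place `.data` overwrite the decoder sees 10, so the gradient flowing back has shape [10] where autograd expects [9]. A minimal sketch of a concatenation that stays inside the autograd graph, using `torch.cat` — the 9-plus-1 shapes come from the code above, the tensors themselves are dummies:

```python
import torch

# Dummy stand-ins with the shapes from the code above.
encoded = torch.randn(9, requires_grad=True)  # encoder output: 9 features
x2 = torch.randn(1)                           # the extra value to append

# torch.cat records the concatenation as an autograd op, so the decoder
# input has shape [10] while the gradient w.r.t. `encoded` keeps shape [9].
decoder_input = torch.cat((encoded, x2), dim=0)

decoder_input.sum().backward()
```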

Any help would be very much appreciated!