Siamese Network training

I am trying to train a correlation-based Siamese network. During training I get an error saying that the target size must be the same as the input size. Is there a way to set up the labels for my training data so that they match the input size?

Could you post some information about your model architecture, the loss function you are using, etc.?
I'm currently not sure what your use case is exactly and where the error is being thrown.

model1.train()
model2.train()

batch_size = 5
epochs = 2

for epoch in range(epochs):
    for i in range(0, len(X1), batch_size):
        batch_X1 = X1[i:i+batch_size].view(-1, 1, 300, 300)
        batch_X2 = X2[i:i+batch_size].view(-1, 1, 150, 150)
        batch_y = y[i:i+batch_size]
        #print("check")
        optimizer1.zero_grad()
        optimizer2.zero_grad()
        #print("check")

        feature1 = model1(batch_X1)
        feature2 = model2(batch_X2)

        b, c, h, w = feature1.shape
        print("cc:", feature1.view(1, b*c, h, w).shape)

        corr_map = F.conv2d(feature1.view(1, b*c, h, w), feature2, groups=b)
        #print("check")
        #loss1 = loss_function(feature1, feature2)
        loss = loss_function(corr_map, batch_y)

        #loss = loss1 + loss2
        #print("check")
        loss.backward()
        optimizer1.step()
        optimizer2.step()

    ep = epoch + 1
    print(f"Epoch: {ep}. Loss: {loss}")

This is the training code. In it the shape of corr_map is [1, 5, 38, 38] while batch_y has shape [1], and I am not able to calculate the loss between them because the sizes don't match. Is there a way to overcome this problem, i.e. to make the target the same size as the input?
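For reference, here is a minimal standalone repro of how that shape comes about (the channel count and feature-map sizes below are made up, chosen only so the output matches the [1, 5, 38, 38] I am seeing): the grouped conv2d slides each sample's feature2 over its own feature1, producing one output channel per sample.

import torch
import torch.nn.functional as F

B, C = 5, 16                            # hypothetical batch size and channel count
feature1 = torch.randn(B, C, 75, 75)    # stand-in for model1's output on the 300x300 branch
feature2 = torch.randn(B, C, 38, 38)    # stand-in for model2's output on the 150x150 branch

# groups=B: each sample's feature2 acts as a kernel over its own feature1,
# so the result has one channel per sample
corr_map = F.conv2d(feature1.view(1, B * C, 75, 75), feature2, groups=B)
print(corr_map.shape)                   # torch.Size([1, 5, 38, 38]); 75 - 38 + 1 = 38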

What loss function are you using and how would you like to calculate the loss between a 4-dimensional tensor and a scalar?
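If the use case is a binary same/different label per image pair, one possible approach (just a sketch under that assumption, not necessarily matching your setup) would be to collapse each sample's correlation map into a single score and compare it against a per-sample label:

import torch
import torch.nn as nn

corr_map = torch.randn(1, 5, 38, 38)           # [1, batch_size, H', W'], as in your code
batch_y = torch.randint(0, 2, (5,)).float()    # assumed: one binary label per pair, shape [batch_size]

# reduce each correlation map to one similarity logit, e.g. by spatial averaging
scores = corr_map.view(5, -1).mean(dim=1)      # shape [batch_size]

loss = nn.BCEWithLogitsLoss()(scores, batch_y)
print(loss)

Alternatively, if the target is supposed to be a dense map, it would need the same spatial shape as corr_map, e.g. [1, 5, 38, 38].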