I am new to PyTorch and am trying to implement a simple model with an embedding layer, using the SMS Spam dataset.
I have already converted the text data into padded index sequences and built a bacth_generator function that yields batches of inputs and targets as tensors.
When I run the training loop, I get:
ValueError: Using a target size (torch.Size([1])) that is different to the input size (torch.Size([1, 83, 7405])) is deprecated. Please ensure they have the same size.
It seems the size of the model's predictions does not match the size of the target batches, so the loss cannot be computed.
What should I do to make them the same size?
Model:
class NLP(nn.Module):
    def __init__(self, embedding_size=50, vocab_size=vocabSize):
        super(NLP, self).__init__()
        self.embeddings = nn.Embedding(vocabSize, embedding_size)
        self.linear1 = nn.Linear(embedding_size, 100)

    def forward(self, inputs):
        lookup_embeds = self.embeddings(inputs)
        out = self.linear1(lookup_embeds)
        out = F.log_softmax(out)
        return out
Training loop:
losses = []
loss = nn.BCELoss()
model = NLP(vocab_size=vocabSize, embedding_size=50)
optimizer = optim.SGD(model.parameters(), lr=0.001)

for epoch in range(10):
    total_loss = 0
    for l, t in bacth_generator(train, 32):
        model.zero_grad()
        prediction = model(t)
        output = loss(prediction, l)

ValueError: Using a target size (torch.Size([32])) that is different to the input size (torch.Size([32, 83, 100])) is deprecated. Please ensure they have the same size.
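For reference, one way the shapes could be made to line up, assuming the targets are binary spam labels of shape (batch,): pool the per-token embeddings over the sequence dimension so each example yields a single logit, and pair that with BCEWithLogitsLoss (which applies the sigmoid internally). This is only a sketch under those assumptions, not necessarily the intended architecture; the vocabulary size, sequence length, and class names below are made up to match the error message.

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 7405  # assumed from the error message

class SpamClassifier(nn.Module):
    def __init__(self, vocab_size, embedding_size=50):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embedding_size)
        self.linear1 = nn.Linear(embedding_size, 1)  # one logit per example

    def forward(self, inputs):
        # inputs: (batch, seq_len) -> embeddings: (batch, seq_len, embedding_size)
        embeds = self.embeddings(inputs)
        # mean-pool over the sequence dimension -> (batch, embedding_size)
        pooled = embeds.mean(dim=1)
        # linear -> (batch, 1), squeeze -> (batch,), matching the target shape
        return self.linear1(pooled).squeeze(-1)

model = SpamClassifier(VOCAB_SIZE)
loss_fn = nn.BCEWithLogitsLoss()

x = torch.randint(0, VOCAB_SIZE, (32, 83))   # fake padded index batch
y = torch.randint(0, 2, (32,)).float()       # fake binary labels

logits = model(x)
loss = loss_fn(logits, y)  # shapes now agree: (32,) vs (32,)
```

With this layout the prediction tensor has the same shape as the target batch, so the size-mismatch error disappears; the key change is collapsing the sequence dimension before the loss instead of passing per-token outputs to it.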