ValueError: Layer weight shape (10500, 768) not compatible with provided weight shape (9, 768)
embedding_matrix = torch.stack(embeddings)
np_arr = embedding_matrix.cpu().detach().numpy()
From embedding_matrix I get, for each sentence, a shape like
torch.Size([6, 768]). I then tried to convert it into an
array to be passed to the model, but got the error above.
I need to combine all the values I got from
embedding_matrix and then convert the result into an array of that shape.
@Sammo1 This is not how you should use an embedding layer. You will have to revisit your architecture.
Excuse me, I don’t understand what you mean.
Are you trying to pass a numpy array to a PyTorch model? This won’t work, as tensors are expected.
Could you share more information about your use case and how to reproduce the issue?
I need to pass it to a Keras model, but the embeddings were extracted with PyTorch, so I need to pass the result as an array. The problem is concatenating every matrix I get from embedding_matrix: each sentence is stored in embedding_matrix separately, and I need to combine them along the first dimension into one shape like ([9+8+13+10], 768).
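If I understand the goal correctly, a minimal sketch might look like the following. It assumes `embeddings` is a list of per-sentence tensors with different token counts (here 9, 8, 13, and 10 are just illustrative sizes); `torch.stack` fails on such a list because it requires equal shapes, while `torch.cat` joins them along dim 0 into one `(9+8+13+10, 768)` tensor that can then be converted to a NumPy array:

```python
import torch

# Hypothetical per-sentence embeddings with varying token counts
# (9, 8, 13, and 10 tokens, each 768-dimensional).
embeddings = [torch.randn(n, 768) for n in (9, 8, 13, 10)]

# torch.stack(embeddings) would raise an error here because the
# first dimensions differ; torch.cat concatenates along dim 0.
embedding_matrix = torch.cat(embeddings, dim=0)
print(embedding_matrix.shape)  # torch.Size([40, 768])

# Detach from the graph and move to CPU before converting to NumPy
# for a Keras/NumPy consumer.
np_arr = embedding_matrix.detach().cpu().numpy()
print(np_arr.shape)  # (40, 768)
```

Whether the Keras layer then accepts this array still depends on its expected input shape, which is what the original ValueError is complaining about.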