ValueError: Layer weight shape not compatible

Excuse me, I got this error
ValueError: Layer weight shape (30522, 768) not compatible with provided weight shape torch.Size([1, 15, 768])
while using BERT as an embedding layer:
embed = Embedding(30522, 768, weights=[embedding_matrix], input_length=max_length, trainable=False)(inputs2)

embedding_matrix comes from:

 input_ids = torch.tensor(tokenizer.encode(sentence)).unsqueeze(0)
 outputs = BERT(input_ids)
 embedding_matrix = outputs[0]  # last hidden state, shape [1, seq_len, 768]

The internal weight matrix of an nn.Embedding layer has the shape [num_embeddings, embedding_dim], which is [30522, 768] in your case. The error is raised because you are trying to pass a 3D tensor with the shape [1, 15, 768] to this layer as its weight. Based on your code, it seems you want to use the output activations of a model as the embedding weight, so maybe try removing the batch dimension and changing num_embeddings to 15.
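
A minimal PyTorch sketch of that suggestion (the shapes are taken from your error message; the actual BERT call and tokenizer are assumed and replaced by a random stand-in tensor here):

 import torch
 import torch.nn as nn

 # the model output carries a batch dimension of 1, which an embedding weight cannot have
 outputs0 = torch.randn(1, 15, 768)   # stand-in for outputs[0], shape [1, 15, 768]
 weight = outputs0.squeeze(0)         # drop the batch dim -> [15, 768]

 # num_embeddings must match the first dim of the weight (15 here, not 30522)
 embed = nn.Embedding.from_pretrained(weight, freeze=True)
 print(embed.weight.shape)            # torch.Size([15, 768])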

I don’t know how BERT can be used as an embedding layer, but in your previous code snippet you were trying to assign the model output as the weight parameter of an embedding layer, which sounds strange at the very least.
An embedding layer can be seen as a lookup table, which accepts indices (i.e. sparse inputs) and maps them to feature tensors (i.e. dense outputs), as in the small example below.
Your current code doesn’t seem to be using PyTorch, so I don’t know whether it could work or not.
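
For illustration, a tiny PyTorch example of this lookup-table behaviour (the sizes are arbitrary):

 import torch
 import torch.nn as nn

 # a lookup table mapping 10 possible indices to 4-dim feature vectors
 table = nn.Embedding(num_embeddings=10, embedding_dim=4)

 indices = torch.tensor([[1, 3, 3, 7]])   # sparse inputs: token ids, shape [1, 4]
 vectors = table(indices)                 # dense outputs, shape [1, 4, 4]
 print(vectors.shape)                     # torch.Size([1, 4, 4])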

I’m not familiar with your use case and don’t know how the model outputs could be used as an embedding, as asked in my previous post.