ValueError: expected sequence of length 82 at dim 1 (got 63)

I am getting this error:

ValueError: expected sequence of length 82 at dim 1 (got 63)

and I suspect this part of my code is not working properly:


import numpy as np
from transformers import BertTokenizer


def data_process(data, labels):
    input_ids = []
    attention_masks = []
    MAX_SEQUENCE_LENGTH = 250
    bert_tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    for sentence in data:
        # Tokenize each sentence, padding/truncating to MAX_SEQUENCE_LENGTH
        bert_inp = bert_tokenizer(sentence, max_length=MAX_SEQUENCE_LENGTH,
                                  padding='max_length',
                                  truncation=True, return_token_type_ids=False
                                  )  # add_special_tokens=True, return_length=True,
                                     # return_tensors="pt"
        input_ids.append(bert_inp['input_ids'])
        attention_masks.append(bert_inp['attention_mask'])

    input_ids = np.asarray(input_ids, dtype=object)
    attention_masks = np.array(attention_masks, dtype=object)
    labels = np.array(labels)

    return input_ids, attention_masks, labels
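
For illustration only (this check is not part of my original code, and train_data / train_labels are placeholder names for my actual inputs): since padding='max_length' and truncation=True are set, I would expect every returned row to have exactly MAX_SEQUENCE_LENGTH entries.

input_ids, attention_masks, labels = data_process(train_data, train_labels)

# With padding='max_length' and truncation=True, every tokenized sentence
# should come back with exactly 250 tokens, so this should print {250}.
print({len(row) for row in input_ids})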

This is the full traceback; the arrow marks the failing line:


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-...> in <module>
    340 # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Tokenization ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~#
    341 # for train set
--> 342 train_seq = torch.tensor(train_text['input_ids'].tolist())  # , dtype=torch.int64
    343 train_mask = torch.tensor(train_text['attention_masks'].tolist(), dtype=torch.int64)
    344 train_y = torch.tensor(train_labels.tolist(), dtype=torch.int64)

ValueError: expected sequence of length 82 at dim 1 (got 63)
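
As far as I understand the message, torch.tensor() needs every inner sequence to have the same length. A tiny standalone example (just for illustration, with the two lengths taken from the error message) reproduces it:

import torch

rows = [[0] * 82, [0] * 63]  # two rows of different lengths
torch.tensor(rows)           # ValueError: expected sequence of length 82 at dim 1 (got 63)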


What am I doing wrong?