Can a transformer automatically learn the length of a sequence?

I am new to transformers, and I need to use one to predict a score from an input sequence of variable length.

Due to the nature of my use case, longer sequences tend to produce higher regression scores, although this is not guaranteed. However, in my PyTorch experiment the model only learns to recognize the sequence patterns that cause a sudden increase in the score; it does not seem to learn that longer sequences tend to have higher scores. So my question is: can a transformer automatically learn the length of the sequence, or do I need to explicitly feed the length as an additional input to the network?

My current implementation is like this:

import torch.nn as nn

# there is no torch.transformer module; I am using the built-in nn.Transformer
transformer = nn.Transformer(...)
input_seq = …
score = transformer(input_seq)
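One way to remove the ambiguity is to feed the length explicitly, as the question suggests. Below is a minimal sketch (the class name `ScoreRegressor`, the dimensions, and the normalized-length feature are all my own assumptions, not the original setup): a `nn.TransformerEncoder` whose output is mean-pooled over the valid positions, with the normalized sequence length concatenated before the final linear score head, so the regressor sees length directly instead of having to infer it from attention patterns.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a transformer regressor that is given the sequence
# length as an explicit feature alongside the pooled encoding.
class ScoreRegressor(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model + 1, 1)  # +1 slot for the length feature

    def forward(self, x, lengths):
        # x: (batch, max_len, d_model); lengths: (batch,) true lengths
        # padding mask: True marks positions past each sequence's end
        mask = torch.arange(x.size(1), device=x.device)[None, :] >= lengths[:, None]
        h = self.encoder(x, src_key_padding_mask=mask)
        # mean-pool only over the valid (unpadded) positions
        valid = (~mask).unsqueeze(-1).float()
        pooled = (h * valid).sum(dim=1) / lengths[:, None].float()
        # append the length, normalized by max_len, as an explicit feature
        length_feat = lengths[:, None].float() / x.size(1)
        return self.head(torch.cat([pooled, length_feat], dim=-1)).squeeze(-1)

x = torch.randn(3, 10, 64)            # batch of 3 padded sequences
lengths = torch.tensor([4, 7, 10])    # their true lengths
scores = ScoreRegressor()(x, lengths) # shape (3,), one score per sequence
```

Without such a feature, length information reaches the model only indirectly (through positional encodings and the padding mask), and a mean-pooled readout can largely wash it out, which would be consistent with the behavior described above.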