How to compute per-label averages of a tensor given offset intervals?

Consider a batched representation of labels (concatenated into sentences), as shown below.

import torch

batch_size = 2       # two sets of labels, one sentence per sample
max_length = 64
rpr_dimension = 768

labels_rpr = torch.rand((batch_size, max_length, rpr_dimension))
labels_rpr # torch.Size([2, 64, 768])
# tensor([[[0.4031, 0.3784, 0.9089,  ..., 0.9796, 0.7666, 0.7783],
#          [0.9969, 0.2751, 0.0540,  ..., 0.8257, 0.4371, 0.2242],
#          [0.0208, 0.9337, 0.7295,  ..., 0.8140, 0.8909, 0.1517],
#          ...,
#          [0.5298, 0.3919, 0.6242,  ..., 0.1503, 0.9446, 0.7762],
#          [0.3007, 0.4251, 0.6003,  ..., 0.1441, 0.0233, 0.0407],
#          [0.3498, 0.3891, 0.0787,  ..., 0.1565, 0.0419, 0.7865]]])

Also, consider labels_offset, which gives the start and end of each label, as shown below:

labels_offset # torch.Size([2, 64])
# tensor([[ 1,  3,  5,  6,  7,  8, 11, 15, 16, 19, 21, 22, 23, 24, 28, 29, 30, 33,
#          34, 36, 37, 38, 39, 40, 44, 45, 47, 51,  0,  0,  0,  0,  0,  0,  0,  0,
#           0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,
#           0,  0,  0,  0,  0,  0,  0,  0,  0,  0],
#         [ 1,  5,  6,  8, 11, 12, 13, 14, 15, 16, 18, 22, 28, 29, 30,  0,  0,  0,
#           0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,
#           0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,
#           0,  0,  0,  0,  0,  0,  0,  0,  0,  0]])

Accordingly, in the first batch sample, the first label's representation spans the interval [1, 3) and the last label's spans [47, 51); consecutive offsets delimit the labels, and the trailing zeros are padding.
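
As a quick sanity check (not the general solution I am after), the first label's vector under that reading is just the mean of rows 1 and 2 of the first sample:

# mean of the rows in [1, 3) of the first sample -> one 768-dim vector
first_label = labels_rpr[0, 1:3].mean(dim=0)
first_label.shape  # torch.Size([768])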

Is there an approach to compute each label's representation as the average of the vectors within its labels_offset interval?
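
To make the expected result concrete, here is a naive loop-based sketch of what I mean (assuming, as described above, that consecutive offsets delimit the labels and that the zeros in labels_offset are padding). I would prefer a cleaner or vectorized way to get the same result:

label_means = []
for b in range(batch_size):
    # keep only the real offsets; the zeros are assumed to be padding
    offsets = labels_offset[b][labels_offset[b] != 0].tolist()
    # each consecutive pair of offsets delimits one label: [start, end)
    means = [labels_rpr[b, start:end].mean(dim=0)
             for start, end in zip(offsets[:-1], offsets[1:])]
    label_means.append(torch.stack(means))

# label_means[0].shape -> torch.Size([27, 768]) for the first sample above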