Compressing Embedding Size

I have a question for all of you. In NLP, say you use some embedding method, and for a paragraph you get a 768×n matrix, where n depends on the number of words, sentences, or the size of the context window.

Now say you want to work in lower dimensions. What are some ways to reduce this dimensionality? One way would be to take the mean across the token axis, collapsing the matrix down to a single 768-dimensional vector.
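For concreteness, here is a minimal sketch of that mean-pooling idea in NumPy, assuming the paragraph is stored as a 768×n array as described above (the names and sizes are just placeholders, not a real embedding model):

```python
import numpy as np

# Hypothetical setup: one 768-dim embedding per token, stacked as a
# 768 x n matrix (n = number of tokens; random values stand in for
# real embeddings here).
n = 42
embeddings = np.random.rand(768, n)

# Mean pooling: average across the token axis (the n columns),
# collapsing the matrix to a single 768-dimensional paragraph vector.
paragraph_vector = embeddings.mean(axis=1)
print(paragraph_vector.shape)  # (768,)
```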