Learning from data with more than 30,000 features

I am considering a model that takes the expression levels of all genes as input. However, since the data has about 30,000 features (roughly 30,000 genes in humans), I am not sure it can be used as-is. Do I need a dimensionality-reduction or embedding step first? I have never worked with data like this before.
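For context, here is a rough sketch of the kind of setup I have in mind. The data is synthetic and the component count is an arbitrary choice, just to illustrate reducing the ~30,000 gene dimensions (here with PCA via SVD) before feeding a model:

```python
import numpy as np

# Hypothetical setup: an (n_samples x n_genes) expression matrix.
# Cohorts usually have far fewer samples than the ~30,000 genes,
# so projecting onto the top principal components is one common first step.
rng = np.random.default_rng(0)
n_samples, n_genes = 100, 30_000
X = rng.normal(size=(n_samples, n_genes))  # stand-in for real expression data

# Center the features, then compute PCA via SVD. With
# n_samples << n_genes there are at most n_samples informative components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 50                 # number of components to keep (a choice, not a rule)
Z = Xc @ Vt[:k].T      # reduced representation, shape (n_samples, k)

print(Z.shape)         # (100, 50)
```

The reduced matrix `Z` would then be the model input instead of the raw 30,000-dimensional vectors; whether 50 components is enough would have to come from validation, not this sketch.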