Alternative to the Keras Tokenizer in PyTorch

I want to convert text into sequences of integer indices so that I can feed them to a network.
Is there a PyTorch alternative for doing this? In Keras I would write:

from tensorflow.keras.preprocessing.text import Tokenizer

tokenizer = Tokenizer()
tokenizer.fit_on_texts(x_train)                   # build the vocabulary from the training texts
x_train = tokenizer.texts_to_sequences(x_train)   # map each text to a list of word indices
# e.g. [[1, 3, 32], [2, 34, 1], ...]
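For reference, here is a minimal hand-rolled sketch of the behaviour I am after, so it is clear what the PyTorch replacement needs to do. It assumes simple whitespace tokenization, word indices starting at 1 (as in Keras, with 0 reserved for padding), and a toy x_train; the variable names are just for illustration.

import torch

x_train = ["the cat sat", "the dog barked"]  # toy data for illustration

# equivalent of fit_on_texts: build a word -> index vocabulary
vocab = {}
for text in x_train:
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab) + 1  # reserve index 0 for padding

# equivalent of texts_to_sequences: map each text to a list of indices
sequences = [[vocab[w] for w in text.lower().split()] for text in x_train]
# e.g. [[1, 2, 3], [1, 4, 5]]

# pad to a common length and convert to a tensor to feed to a network
max_len = max(len(s) for s in sequences)
padded = [s + [0] * (max_len - len(s)) for s in sequences]
batch = torch.tensor(padded)  # shape: (num_texts, max_len)

Is there a built-in or standard PyTorch (or torchtext) way to do this instead of writing it by hand?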