Combining TEXT.build_vocab with BERT embeddings

Questions and Help

Description

Hi, we can use GloVe embeddings when building the vocabulary with something like:

import torch

# TEXT is a torchtext Field and train_data the training Dataset
MIN_FREQ = 2

TEXT.build_vocab(train_data,
                 min_freq=MIN_FREQ,
                 vectors="glove.6B.300d",
                 unk_init=torch.Tensor.normal_)

However, I want to use BERT embeddings instead, because I need a sophisticated model to compare the performance of multiple embedding types. How can I use BERT in build_vocab?
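The closest workaround I can think of is to skip the vectors argument and copy BERT's static (non-contextual) input-embedding table into my own weight matrix after build_vocab. Below is a rough sketch of that idea using the HuggingFace transformers package; the bert-base-uncased checkpoint is just an example, TEXT/train_data/MIN_FREQ come from the snippet above, and any vocab token that BERT splits into several word pieces keeps a random init:

import torch
from transformers import BertModel, BertTokenizer

# Load BERT and grab its static input-embedding table
# (an nn.Embedding of shape [vocab_size, hidden_size], e.g. 30522 x 768).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
bert_weights = bert.get_input_embeddings().weight.detach()

# Build the vocab as usual, but without the vectors argument.
TEXT.build_vocab(train_data, min_freq=MIN_FREQ)

# Copy the BERT row for every vocab token that maps to exactly one
# word piece; everything else stays randomly initialized.
emb_dim = bert_weights.size(1)
weights = torch.randn(len(TEXT.vocab), emb_dim)
for i, token in enumerate(TEXT.vocab.itos):
    piece_ids = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(token))
    if len(piece_ids) == 1 and piece_ids[0] != tokenizer.unk_token_id:
        weights[i] = bert_weights[piece_ids[0]]

# model.embedding.weight.data.copy_(weights)  # then load into the model

That said, since BERT embeddings are contextual, I suspect the more usual route is to build the vocab from tokenizer.vocab itself and run the token ids through BERT inside the model rather than using a static lookup. Is there a supported way to do either of these with build_vocab?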