The build_vocab function does not take the argument "specials":

text.build_vocab(ds_train, max_size=num_words, specials=['',''])
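For context, here is a minimal stdlib sketch of what "specials" means in a torchtext-style vocabulary: special tokens are placed at the start of the index, ahead of the most frequent corpus words. The token names and the helper itself are illustrative assumptions, not torchtext's actual implementation.

```python
from collections import Counter

def build_vocab(tokenized_texts, max_size, specials=("<unk>", "<pad>")):
    # Count token frequencies across the whole corpus.
    counter = Counter(tok for text in tokenized_texts for tok in text)
    # Specials come first, then the max_size most common tokens.
    itos = list(specials) + [tok for tok, _ in counter.most_common(max_size)]
    stoi = {tok: i for i, tok in enumerate(itos)}
    return itos, stoi

itos, stoi = build_vocab([["the", "cat", "sat"], ["the", "dog"]], max_size=3)
print(itos)  # specials first, then the most frequent words
```

If build_vocab rejects the keyword, one workaround along these lines is to build the mapping yourself and attach it, since the end result is just an index with the special tokens in front.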


Full code: https://github.com/Atcold/pytorch-Deep-Learning/blob/master/15-transformer.ipynb
PyTorch version: 1.6.0+cu101

If you don't mind, I'd also like to ask how to use a mask token here. I've seen some transformer models mask out words and then fill them in later.
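To make the question concrete, here is a toy sketch of that masking idea: randomly replace some tokens with a mask token so a model can be trained to predict them back. The "<mask>" token name and the 15% mask rate are illustrative assumptions, not the notebook's actual settings.

```python
import random

def mask_tokens(tokens, mask_token="<mask>", mask_prob=0.15, seed=0):
    rng = random.Random(seed)  # seeded for reproducibility
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)  # hide the token from the model
            targets.append(tok)        # remember it as the prediction target
        else:
            masked.append(tok)
            targets.append(None)       # nothing to predict at this position
    return masked, targets

masked, targets = mask_tokens("the quick brown fox jumps over the lazy dog".split())
```

During training, the loss would then be computed only at the positions where targets is not None.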