Vocab in torchtext 0.10.0: no max_size?

So torchtext 0.10.0 saw the Vocab class undergo significant changes. The new Vocab class is:
torchtext.vocab.Vocab(vocab)
where vocab is built with the factory function:
torchtext.vocab.vocab(ordered_dict: Dict, min_freq: int = 1) → torchtext.vocab.Vocab
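For context, this is how I understand the new factory is meant to be used (the tokens and counts below are just made up for illustration):

```python
from collections import Counter, OrderedDict
from torchtext.vocab import vocab

# Count tokens, then pass an ordered dict of frequencies to the factory function
counter = Counter(["hello", "world", "hello", "torchtext"])
ordered_dict = OrderedDict(counter.most_common())
v = vocab(ordered_dict, min_freq=1)
print(v.get_itos())  # ['hello', 'world', 'torchtext']
```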

In previous versions (up to 0.9) the Vocab class took the following arguments upon initialisation:
counter, max_size=None, min_freq=1, specials=('<unk>', '<pad>'), vectors=None, unk_init=None, vectors_cache=None, specials_first=True

While I can see most of the old functionality available as methods on the new class, I could not figure out how to constrain the vocabulary size (the old max_size) with the new API. There is no parameter or method that does this. Am I missing something? Should/will this be implemented in a future update?
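The closest workaround I can see is truncating the counter myself before calling vocab(), mimicking the old max_size argument. A minimal sketch (max_size and the tokens here are just placeholders):

```python
from collections import Counter, OrderedDict
from torchtext.vocab import vocab

max_size = 2  # hypothetical cap; not an actual parameter of the new API
tokens = ["hello", "world", "hello", "torchtext", "hello", "world"]

counter = Counter(tokens)
# Keep only the max_size most frequent tokens before building the vocab
truncated = OrderedDict(counter.most_common(max_size))
v = vocab(truncated, min_freq=1)
print(v.get_itos())  # ['hello', 'world']
```

This works, but it has to be repeated everywhere a vocab is built, so a built-in parameter would still be nicer.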
