Meaning of encoded_layers[11] in BERT

I have a problem in BERT with understanding the meaning of encoded_layers[11]. I know that the first element of the outputs, obtained with encoded_layers = outputs[0], is the hidden state of the last layer of the BERT model. So what do the following lines mean?

layers, _ = BERTModel(tokens_tensor)  # returns (encoded_layers, pooled_output); layers is a list of per-layer hidden states
embedding = layers[11].squeeze(0)     # hidden state of the 12th (last) encoder layer, batch dimension removed
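
For context, here is a minimal sketch of how the same tensors can be inspected with the Hugging Face transformers package and the bert-base-uncased checkpoint (both assumptions; the question's snippet looks like the older pytorch-pretrained-bert API, where the model returns a list of 12 per-layer hidden states and index 11 is the last one). Note the index shift: with output_hidden_states=True, transformers returns 13 tensors, the embedding output plus the 12 layer outputs, so the last layer sits at index 12 and is identical to outputs[0]:

import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# tokens_tensor mirrors the name used in the question
tokens_tensor = tokenizer("Hello world", return_tensors="pt")["input_ids"]

with torch.no_grad():
    outputs = model(tokens_tensor)

# outputs.hidden_states is a tuple of 13 tensors: the embedding output
# followed by the output of each of the 12 encoder layers.
last_layer = outputs.hidden_states[12]                          # last encoder layer
print(torch.equal(last_layer, outputs.last_hidden_state))       # True: same tensor as outputs[0]

embedding = last_layer.squeeze(0)  # drop the batch dimension -> shape (seq_len, 768)

So in the question's snippet, layers[11] is simply the hidden state of the last (12th) encoder layer, the same tensor the newer API exposes as outputs[0].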