IndexError: index out of range in self, Positional Embedding

I’m doing Convolutional Sequence to Sequence Learning, following this notebook: https://github.com/bentrevett/pytorch-seq2seq/blob/master/5%20-%20Convolutional%20Sequence%20to%20Sequence%20Learning.ipynb
While training I get this error:
IndexError Traceback (most recent call last)
in ()
8 start_time = time.time()
9
---> 10 train_loss = train(model, train_iterator, optimizer, criterion, CLIP)
11 valid_loss = evaluate(model, valid_iterator, criterion)
12

7 frames
in train(model, iterator, optimizer, criterion, clip)
12 optimizer.zero_grad()
13
---> 14 output, _ = model(src, trg[:,:-1])
15
16 #output = [batch size, trg len - 1, output dim]

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
---> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []

in forward(self, src, trg)
15 #encoder_combined is encoder_conved plus (elementwise) src embedding plus
16 # positional embeddings
---> 17 encoder_conved, encoder_combined = self.encoder(src)
18
19 #encoder_conved = [batch size, src len, emb dim]

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
---> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []

in forward(self, src)
48 tok_embedded = self.tok_embedding(src)
49 print(pos.shape)
---> 50 pos_embedded = self.pos_embedding(pos)
51
52 #tok_embedded = pos_embedded = [batch size, src len, emb dim]

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
---> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/sparse.py in forward(self, input)
158 return F.embedding(
159 input, self.weight, self.padding_idx, self.max_norm,
---> 160 self.norm_type, self.scale_grad_by_freq, self.sparse)
161
162 def extra_repr(self) -> str:

/usr/local/lib/python3.7/dist-packages/torch/nn/functional.py in embedding(input, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
2042 # remove once script supports set_grad_enabled
2043 no_grad_embedding_renorm(weight, input, max_norm, norm_type)
---> 2044 return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
2045
2046

IndexError: index out of range in self

Can anyone help me resolve this, please?

An nn.Embedding layer expects its input to contain indices in the range [0, num_embeddings - 1], while your input apparently contains indices that are out of bounds. Check the min and max values of your input and make sure they fall within that range.
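In this tutorial the positional embedding is created with a fixed `max_length` (the encoder/decoder default is 100), so a batch containing a sequence longer than that produces position indices outside the valid range. A minimal sketch reproducing the failure and the sanity check, with illustrative sizes:

```python
import torch
import torch.nn as nn

# Illustrative sizes; in the tutorial the positional embedding is built
# as nn.Embedding(max_length, emb_dim) with max_length defaulting to 100.
max_length = 100
emb_dim = 256
pos_embedding = nn.Embedding(max_length, emb_dim)

# Valid positions: indices in [0, max_length - 1].
pos = torch.arange(0, max_length).unsqueeze(0)   # shape [1, max_length]
out = pos_embedding(pos)                         # shape [1, max_length, emb_dim]

# Any index >= max_length (or < 0) raises
# "IndexError: index out of range in self".
bad_pos = torch.arange(0, max_length + 1).unsqueeze(0)
caught = False
try:
    pos_embedding(bad_pos)
except IndexError:
    caught = True

# Quick sanity check to run on your own batch before the forward pass:
print(pos.min().item(), pos.max().item(), pos_embedding.num_embeddings)
```

If the max index meets or exceeds `num_embeddings`, either truncate/filter your sequences to `max_length` tokens during preprocessing, or construct the model with a larger `max_length`.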