Index out of range in self

In the embedding layer of my model I have defined vocab_size and embedding_dim arguments, but when printing the model summary I don't see the values I set for those hyperparameters.

The embedding_dim is set to 512, the same as the sequence length. I have attached the code and error snippets for clarity.

Model Architecture


When you define the class, the order of the parameters in `__init__` (after `self`) must match the order of the arguments you pass when instantiating it; otherwise, pass them by keyword as `<parameter_name>=<value>`.

So one way is

network = RNNClassifier(embedding_dim, hidden_dim, layer_dim, vocab_size, output_dim)
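A minimal sketch of the mismatch, assuming a hypothetical `RNNClassifier` whose `__init__` takes the parameters in the order shown (the actual class may differ). If positional arguments are passed in a different order, `vocab_size` can end up where `embedding_dim` belongs, and later index lookups fail with "index out of range in self":

```python
import torch.nn as nn

class RNNClassifier(nn.Module):
    # Hypothetical signature -- the positional order here must match the call site.
    def __init__(self, embedding_dim, hidden_dim, layer_dim, vocab_size, output_dim):
        super().__init__()
        # nn.Embedding takes (num_embeddings, embedding_dim) in that order;
        # swapping them silently builds a too-small lookup table.
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.rnn = nn.RNN(embedding_dim, hidden_dim, layer_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, output_dim)

# Keyword arguments make the mapping explicit and order-independent:
network = RNNClassifier(embedding_dim=512, hidden_dim=128,
                        layer_dim=2, vocab_size=10000, output_dim=2)
```

With keywords, a reordered call site can no longer swap `vocab_size` and `embedding_dim` by accident.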


Ohhh, yes. I hadn't seen that. Very embarrassing. Thank you!

However, I’m facing this other error

embedding(): argument ‘indices’ (position 2) must be Tensor, not tuple

I have converted max_features to a tensor using torch.tensor, but the same error still occurs. max_features is the total vocab_size that I'm passing to the embedding layer.
Please help me figure out this issue as well.
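A hedged sketch of what this error usually means, without seeing the full code: `max_features` (the vocab size) configures the `nn.Embedding` layer; it is not what the layer's forward pass consumes. The forward pass expects a tensor of token indices, and a common cause of the "must be Tensor, not tuple" error is passing a whole `(inputs, labels)` tuple from a DataLoader without unpacking it first:

```python
import torch
import torch.nn as nn

# max_features is the vocabulary size -- it configures the layer,
# it is not the input to the forward pass.
max_features = 10000
embedding = nn.Embedding(max_features, 512)

# A DataLoader batch is often a tuple of (inputs, labels).
batch = (torch.randint(0, max_features, (4, 20)), torch.zeros(4))

# Calling embedding(batch) raises:
#   embedding(): argument 'indices' (position 2) must be Tensor, not tuple
# Unpack the tuple and pass only the index tensor:
inputs, labels = batch
out = embedding(inputs)  # shape: (4, 20, 512)
```

If the model's `forward` receives the raw batch, unpack it there (or in the training loop) before the embedding call.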