I have defined the hidden layers of my sequential model using the following code:
self.hiddenlayers = nn.GRU(input_size, hidden_size, num_layers=num_layers, batch_first=True)
When I try to pass input through that layer like this:
x = F.relu(self.hiddenlayers(x))
I get the following error:
TypeError: relu(): argument 'input' (position 1) must be Tensor, not tuple
The problem seems to be that when I pass x through the hidden layers, the call returns a tuple instead of a tensor that I can pass to the ReLU function. How do I convert the result of self.hiddenlayers(x) into a tensor that I can pass to relu()?
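From the docs I gather that nn.GRU returns a tuple of (output, h_n), so I suspect I need to unpack it before applying the activation, something along these lines (untested, just my guess at what is intended):

output, h_n = self.hiddenlayers(x)  # output: per-timestep hidden states of the last layer, h_n: final hidden state per layer
x = F.relu(output)                  # apply ReLU to the GRU output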
I am trying to create a stacked GRU where each layer has a ReLU activation. Is this approach a correct way of accomplishing that?
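To make the intent concrete, here is a rough sketch of the behaviour I am after, using separate single-layer GRUs with ReLU applied between them (the class and attribute names here are placeholders I made up for this question, not part of my real code):

import torch.nn as nn
import torch.nn.functional as F

class StackedGRUWithReLU(nn.Module):  # placeholder name for illustration
    def __init__(self, input_size, hidden_size, num_layers):
        super().__init__()
        # one single-layer GRU per stacked layer so ReLU can be applied between layers
        sizes = [input_size] + [hidden_size] * num_layers
        self.gru_layers = nn.ModuleList(
            [nn.GRU(sizes[i], sizes[i + 1], batch_first=True)
             for i in range(num_layers)]
        )

    def forward(self, x):
        for gru in self.gru_layers:
            x, _ = gru(x)   # unpack (output, h_n) and keep the per-timestep outputs
            x = F.relu(x)   # ReLU on this layer's outputs before feeding the next layer
        return x

Is this roughly what I would need, or is there a cleaner way to get the same effect with a single nn.GRU?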