Hmm. The commented-out code uses ReLU, and it worked. I tried substituting SELU with what looks to me like the same syntax, but that doesn't work with SELU. Can you please rewrite this in a way that works? Should I be using SELU from the functional module rather than from nn?
def forward(self, state):
    """Build an actor (policy) network that maps states -> actions."""
    # x = F.relu(self.bn1(self.fc1(state)))
    # x = F.relu(self.fc2(x))
    x = nn.SELU(self.bn1(self.fc1(state)))
    x = nn.SELU(self.fc2(x))
    return torch.tanh(self.fc3(x))
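
For reference, here is what I suspect the fix looks like (untested). My understanding is that nn.SELU is a Module class, so nn.SELU(x) constructs a module with x as a constructor argument instead of applying the activation; the functional form F.selu(x) applies it directly to a tensor, matching the F.relu pattern in the commented-out lines:

import torch
import torch.nn.functional as F

def forward(self, state):
    """Build an actor (policy) network that maps states -> actions."""
    # F.selu applies the SELU activation directly to a tensor,
    # the same way F.relu does in the commented-out version.
    x = F.selu(self.bn1(self.fc1(state)))
    x = F.selu(self.fc2(x))
    return torch.tanh(self.fc3(x))

The alternative would presumably be to instantiate the module once in __init__ (e.g. self.selu = nn.SELU()) and call self.selu(...) here, but the functional form seems like the closest drop-in replacement for F.relu.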