Output of my network is 0 for some reason

        self.linear_right = torch.nn.Sequential(
            torch.nn.Linear(20, (feature_dim + 20) // 2),
            torch.nn.ReLU(),
            torch.nn.Linear((feature_dim + 20) // 2, feature_dim),
            torch.nn.ReLU6()
        )
...
        mean_out = torch.mean(down_out, 2)

        linear_out = self.linear_right(mean_out)
        print('mean')
        print(mean_out)
        print(self.linear_right[2].weight)
        print(linear_out)

I have something like this, and it outputs:

mean
tensor([[0.00e+00, 0.00e+00, 3.66e+01, 0.00e+00, 0.00e+00, 0.00e+00, 7.40e+01, 5.98e+01, 5.52e+01, 0.00e+00,
         4.82e+00, 0.00e+00, 7.89e+00, 0.00e+00, 0.00e+00, 1.05e+00, 1.77e+01, 0.00e+00, 3.64e+01, 0.00e+00]],
       grad_fn=<MeanBackward1>)
Parameter containing:
tensor([[1.79e-02, 7.82e-02, -1.99e-02,  ..., 4.72e-02, 3.25e-03, 3.68e-02],
        [1.23e-01, 7.37e-02, -4.71e-02,  ..., 1.88e-02, -9.74e-02, -3.74e-02],
        [8.93e-02, 7.38e-02, -4.53e-02,  ..., 2.13e-02, -1.54e-02, -3.55e-02],
        ...,
        [-9.18e-02, 2.46e-02, 6.96e-02,  ..., -4.98e-02, -9.11e-02, -2.09e-02],
        [-4.82e-02, -9.72e-02, 8.65e-02,  ..., -6.63e-02, -5.12e-02, -6.78e-02],
        [3.11e-03, 1.28e-02, -6.40e-02,  ..., 4.17e-03, 5.25e-02, -1.37e-02]],
       requires_grad=True)
tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
         0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
         0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
         0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
         0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
         0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
         0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]],
       grad_fn=<HardtanhBackward0>)

So the output of linear_out = self.linear_right(mean_out) is all zeros, but the input isn't zero and neither are the weights. Why is that?

The last ReLU6 activation will return all zeros if its input is non-positive: ReLU6 computes min(max(x, 0), 6), so every value ≤ 0 is clamped to exactly 0. The grad_fn=<HardtanhBackward0> on your zero tensor confirms the last op applied was the ReLU6 (PyTorch implements it as a hardtanh clamped to [0, 6]). Could you remove it for the sake of debugging and check the output again?
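A minimal way to do that is to comment out the final activation in your own block; nothing else changes:

    self.linear_right = torch.nn.Sequential(
        torch.nn.Linear(20, (feature_dim + 20) // 2),
        torch.nn.ReLU(),
        torch.nn.Linear((feature_dim + 20) // 2, feature_dim),
        # torch.nn.ReLU6()  # temporarily removed: clamps to [0, 6], zeroing all negatives
    )

For reference, a quick standalone check of the clamping behaviour (the input values here are arbitrary, just for illustration):

    import torch

    x = torch.tensor([-2.0, -0.1, 0.0, 3.0, 9.0])
    print(torch.nn.ReLU6()(x))  # tensor([0., 0., 0., 3., 6.])

If the output then comes out negative, the zeros were simply ReLU6 clamping negative pre-activations, and the next thing to investigate is why the second Linear produces only values ≤ 0 (the unnormalized inputs in the 30-70 range would be one place to look).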