Gradient calculation with symmetric matrix

Hi,
I am trying to compute the gradient of a model's output with respect to the input, which is a symmetric matrix. I expect the gradient to be symmetric as well. Can someone help explain why it is not? This is the code I have been trying:

import torch
import torch.nn.functional as F

# Variable is deprecated; set requires_grad on the tensor directly
SM = matrix.clone().detach().requires_grad_(True)
out = model(SM)
output = F.softmax(out, dim=1)  # softmax over the class dimension
model.zero_grad()
out_class = output.max(1)[1].item()  # index of the predicted class
# Back-propagate a one-hot vector so that SM.grad holds
# d(output[0, out_class]) / d(SM)
one_hot_output = torch.zeros(output.size())
one_hot_output[0, out_class] = 1
output.backward(gradient=one_hot_output)
grad_out_in = SM.grad[0]

For example, the gradient printed by the above code looks like this:
tensor([[[-0.0465, -0.0377, -0.0602, …,  0.0117,  0.0323, -0.0426],
         [ 0.0018, -0.0711,  0.0594, …,  0.0302, -0.1398,  0.0213],
         [-0.0123,  0.0490,  0.0875, …, -0.0314,  0.0194, -0.0434],
         …,
         [ 0.0363,  0.0121,  0.0098, …,  0.0108,  0.0057,  0.0425],
         [ 0.0596, -0.0743,  0.0541, …,  0.0290, -0.1835,  0.1229],
         [-0.0395,  0.0283, -0.0428, …, -0.0294,  0.1277, -0.0793]]])
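
A quick way to confirm the asymmetry numerically (assuming grad_out_in squeezes down to the 2-D matrix shown above):

g = grad_out_in.squeeze()
print(torch.allclose(g, g.t()))  # prints False: the gradient is not symmetric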

Using PyTorch 1.4 with CUDA 9.2.

Hi,

Why do you expect the gradient to be symmetric?
If I take a matrix and pass it through a function that sums only its first row, the gradient will be 1s in the first row and 0s everywhere else. Whether the input matrix is symmetric or not does not influence that, as the sketch below shows.
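
A minimal sketch of that example (the matrix values here are made up for illustration):

import torch

# A symmetric input matrix
A = torch.tensor([[1., 2., 3.],
                  [2., 4., 5.],
                  [3., 5., 6.]], requires_grad=True)

# A function that only looks at the first row of its input
out = A[0].sum()
out.backward()

print(A.grad)
# tensor([[1., 1., 1.],
#         [0., 0., 0.],
#         [0., 0., 0.]])

The gradient only reflects which entries the function actually used, so even a perfectly symmetric input produces an asymmetric gradient here.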