Autograd backward with a matrix


I’m new to PyTorch and I have a small question.
When we use PyTorch for backpropagation, we usually call loss.backward(), where loss is a scalar. But what if my loss is not a scalar? I have a 54×3840 matrix, and I want to backpropagate through it as a Jacobian. Does anyone know how to do this?
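A minimal sketch of the two usual options, using a small made-up shape instead of 54×3840: either pass a `gradient` tensor to `backward()` (autograd then computes a vector–Jacobian product, not the full Jacobian), or use `torch.autograd.functional.jacobian` when you need the Jacobian itself:

```python
import torch

# Toy example: y = W @ x is a vector, not a scalar.
x = torch.randn(5, requires_grad=True)
W = torch.randn(3, 5)
y = W @ x  # shape (3,)

# backward() on a non-scalar needs a "gradient" argument with the
# same shape as y; autograd then computes the vector-Jacobian
# product v^T J and accumulates it into x.grad.
v = torch.ones_like(y)
y.backward(v)
# x.grad now holds W.T @ v, i.e. one VJP, not the full Jacobian

# For the full Jacobian, use torch.autograd.functional.jacobian:
J = torch.autograd.functional.jacobian(lambda inp: W @ inp, x)
# J has shape (3, 5) and here equals W
```

Note that computing the full Jacobian of a 54×3840 output can be expensive, since autograd builds it row by row from repeated VJPs; if you only need gradients of some weighted sum of the matrix entries, a single `backward(gradient)` call is much cheaper.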