Doing that is fine; it would look like this:
b = nn.MSELoss()(output_x, x_labels)
a = nn.CrossEntropyLoss()(output_y, y_labels)
loss = a + b
loss.backward()
Note the additional parentheses, as James mentioned above: nn.MSELoss() first instantiates the loss module, and the second pair of parentheses then calls it directly on the tensors.
This is equivalent to:
b = nn.MSELoss()
a = nn.CrossEntropyLoss()
loss_a = a(output_y, y_labels)
loss_b = b(output_x, x_labels)
loss = loss_a + loss_b
loss.backward()
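For context, here is a minimal runnable sketch with dummy data showing the summed loss driving a single backward pass. The two-headed model, tensor shapes, and optimizer below are hypothetical stand-ins, not from the original question:

import torch
import torch.nn as nn

# Hypothetical model with a regression head and a classification head.
class TwoHeadNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(8, 16)
        self.reg_head = nn.Linear(16, 1)   # regression output -> MSELoss
        self.cls_head = nn.Linear(16, 3)   # classification logits -> CrossEntropyLoss

    def forward(self, x):
        h = torch.relu(self.backbone(x))
        return self.reg_head(h), self.cls_head(h)

model = TwoHeadNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Dummy batch of 4 samples (shapes are illustrative).
inputs = torch.randn(4, 8)
x_labels = torch.randn(4, 1)          # float regression targets
y_labels = torch.randint(0, 3, (4,))  # long class indices

output_x, output_y = model(inputs)

b = nn.MSELoss()(output_x, x_labels)
a = nn.CrossEntropyLoss()(output_y, y_labels)
loss = a + b

optimizer.zero_grad()
loss.backward()  # gradients from both losses accumulate into the shared parameters
optimizer.step()

Since both losses flow back through the shared backbone, the single backward() call accumulates gradients from both objectives, which is exactly why summing them first works.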