How to get gradients of different losses at once?

I want to get the gradient of more than one loss with respect to the same variable, so I tried autograd.grad:

import torch

x = torch.ones(5, requires_grad=True)

def l2(x):
    return (x * x).sum()

def l3(x):
    return (x * x * x).sum()

loss1 = l2(x)
loss2 = l3(x)
torch.autograd.grad((loss1, loss2), x)

but I get

(tensor([5., 5., 5., 5., 5.]),)

which means PyTorch returns the gradient of loss1 + loss2 rather than the gradients of loss1 and loss2 separately.
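If I understand correctly, passing a tuple of outputs is equivalent to differentiating their sum: d(loss1)/dx = 2x and d(loss2)/dx = 3x^2, which add up to 5 at x = 1. A quick check confirms this:

import torch

x = torch.ones(5, requires_grad=True)
total = (x * x).sum() + (x * x * x).sum()
print(torch.autograd.grad(total, x))
# (tensor([5., 5., 5., 5., 5.]),)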

I have a workaround using the code below, but it needs a second variable (and, in a real model, perhaps two networks):

import torch

x = torch.ones(5, requires_grad=True)
y = torch.ones(5, requires_grad=True)

def l2(x):
    return (x * x).sum()

def l3(x):
    return (x * x * x).sum()

loss1 = l2(x)  # depends only on x
loss3 = l3(y)  # depends only on y
torch.autograd.grad((loss1, loss3), (x, y))
# (tensor([2., 2., 2., 2., 2.]), tensor([3., 3., 3., 3., 3.]))
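The only single-variable approach I can think of is to call autograd.grad once per loss, keeping the graph alive between calls. A minimal sketch (retain_graph=True matters when the losses share parts of the graph, e.g. a common network; here the two graphs happen to be independent):

import torch

x = torch.ones(5, requires_grad=True)
loss1 = (x * x).sum()
loss2 = (x * x * x).sum()

# Retain the graph so a second backward pass is possible
g1, = torch.autograd.grad(loss1, x, retain_graph=True)
g2, = torch.autograd.grad(loss2, x)
print(g1)  # tensor([2., 2., 2., 2., 2.])
print(g2)  # tensor([3., 3., 3., 3., 3.])

torch.autograd.functional.jacobian on a function that stacks the losses would give the same per-loss gradients as rows of the Jacobian, but as far as I know it also performs one backward pass per output internally.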

This still costs one backward pass per loss, though. Does anyone have a better idea for getting gradients of several losses with respect to a single variable? Thanks!