torch.cat in backpropagation

Hi,
I was testing how torch.cat affects backpropagation of the loss. I wrote this piece of code, but I'm not sure why no gradient reaches the inputs. Any clue what might be wrong?

x = Variable(torch.tensor([1.]), requires_grad=True)
y = Variable(torch.tensor([1.]), requires_grad=True)
target = Variable(torch.tensor([3., 3.]))
z = torch.cat([x, y])
loss = Variable(torch.mean(z - target), requires_grad=True)
loss.backward()

Now when I check the gradients of the variables x and y, their values are None! What is going on?

>>> x.grad == None
True
>>> y.grad == None
True
I also tried calling retain_grad() on z before the backward pass:

x = Variable(torch.tensor([1.]), requires_grad=True)
y = Variable(torch.tensor([1.]), requires_grad=True)
target = Variable(torch.tensor([3., 3.]))
z = torch.cat([x, y])
z.retain_grad()
loss = Variable(torch.mean(z - target), requires_grad=True)
loss.backward()

But the result is the same: z.grad, x.grad, and y.grad are all still None.
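
My current guess is that wrapping the result in Variable(..., requires_grad=True) creates a brand-new leaf that is cut off from the graph containing x and y, so backward() never reaches them. Below is a minimal sketch of what I would expect to work instead, using plain tensors (Variable has been deprecated since PyTorch 0.4, and tensors with requires_grad=True serve the same purpose); I'm not certain this is the right explanation, so corrections welcome:

import torch

x = torch.tensor([1.], requires_grad=True)
y = torch.tensor([1.], requires_grad=True)
target = torch.tensor([3., 3.])

z = torch.cat([x, y])           # cat is differentiable; gradients split back to x and y
loss = torch.mean(z - target)   # no Variable wrapper, so loss stays attached to the graph
loss.backward()

print(x.grad)  # expected: tensor([0.5000]), since d(mean)/dz_i = 1/2
print(y.grad)  # expected: tensor([0.5000])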