Newbie question about autograd

What kinds of operations can I perform on tensors while still being able to compute gradients? For instance, when I try

import torch

x = torch.rand(1, requires_grad=True)
y = x + 2
print(y.grad_fn)

I get <AddBackward0 object at 0x7fd459552320>

but when I try something like

x = torch.rand(1, requires_grad=True)
y = torch.tensor([x, 0])
print(y.grad_fn)

I get None.

Recreating a tensor with torch.tensor copies the data into a new leaf tensor, which breaks the computation graph, so the result has no grad_fn.
If you would like to combine two tensors, use torch.cat or torch.stack instead:

x = torch.rand(1, requires_grad=True)
y = torch.cat([x, torch.tensor([0.])])
print(y.grad_fn)
<CatBackward object at 0x7fdb08dca6d8>
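
To double-check that gradients really flow through the concatenation, you can run a backward pass and inspect x.grad. Here is a minimal sketch (the extra zero tensor is just an illustrative placeholder):

import torch

x = torch.rand(1, requires_grad=True)
y = torch.cat([x, torch.tensor([0.])])  # grad_fn is CatBackward
y.sum().backward()                      # d(sum)/dx = 1 for the slot holding x
print(x.grad)                           # tensor([1.])

# torch.stack works the same way but adds a new dimension,
# so its inputs must have matching shapes:
z = torch.stack([x.squeeze(), torch.tensor(0.)])
print(z.grad_fn)                        # <StackBackward0 object at ...>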