# How to detach() rows of a tensor?

Is it possible to detach a row or a few elements of a tensor? I tried this:

```python
>>> a = torch.rand((5, 3), requires_grad=True)
>>> a.requires_grad
True
>>> a[0].requires_grad
True
>>> a = a.detach()  # this detaches the whole tensor, not a single row
```

I don’t think it’s possible. Could you just split the tensor and use the parts:

```python
a = torch.randn(5, 3, requires_grad=True)
b = a.detach()       # detached copy of the full tensor
a = a[[0, 1, 3, 4]]  # rows that should keep their gradient history
```

or could you explain your use case a bit more?
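As a minimal sketch of this splitting idea (the choice of row indices here is just an example), the detached row contributes to the loss but receives no gradient:

```python
import torch

a = torch.randn(5, 3, requires_grad=True)

keep = a[[0, 1, 3, 4]]   # these rows stay in the autograd graph
frozen = a[2].detach()   # this row is used without gradient

loss = (keep ** 2).sum() + (frozen ** 2).sum()
loss.backward()

# a.grad is 2 * a in rows 0, 1, 3, 4 and zero in row 2
```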

I found that even though a.requires_grad is True, the gradient doesn’t flow through a. Maybe this is a bug? More generally, if you want to detach() an arbitrary part of a tensor, you can stack the tensor with a second, detached copy of itself, then use torch.gather (or indexing) on the combined tensor to pick each element from either the original copy (with gradient) or the detached one.
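A sketch of that trick, assuming we want rows 2 and 4 detached (plain row indexing stands in for torch.gather here, since whole rows are selected):

```python
import torch

a = torch.randn(5, 3, requires_grad=True)

# Two copies stacked: rows 0-4 carry gradient, rows 5-9 are detached.
both = torch.cat([a, a.detach()], dim=0)

detached_rows = {2, 4}  # rows to detach (an example choice)
idx = torch.tensor([i + 5 if i in detached_rows else i for i in range(5)])
mixed = both[idx]  # same values as a, but rows 2 and 4 are out of the graph

loss = (mixed ** 2).sum()
loss.backward()

# a.grad is 2 * a everywhere except rows 2 and 4, which are zero
```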


Strictly speaking this is not a detach(), but this way you can just set the gradients of those rows to zero with a hook:

```python
import torch

a = torch.ones((5, 3), requires_grad=True)

inds = [2, 4]  # rows which will have zero gradients

def hook(grad):
    grad = grad.clone()  # don't modify the incoming gradient in place
    grad[inds] = 0
    return grad

a.register_hook(hook)
b = 2 * a

loss = torch.sum(b**2)
loss.backward()

print(a.grad)
# tensor([[8., 8., 8.],
#         [8., 8., 8.],
#         [0., 0., 0.],
#         [8., 8., 8.],
#         [0., 0., 0.]])
```