For example, given a tensor a = [0.4, 0.1, 0.2, 0.1, 0.5], I want to apply softmax to the top-3 elements and set all other elements to 0, so that a becomes [0.3420, 0.0000, 0.2800, 0.0000, 0.3780]. But when I implement this in PyTorch, loss.backward() raises: "RuntimeError: leaf variable has been moved into the graph interior".

Here is my code:

```
import torch

def new_softmax(raw, k):
    # Softmax over the top-k entries; zero out the rest.
    values, idxs = raw.topk(k)
    total = sum(torch.exp(values))
    for i in idxs:
        raw[i] = torch.exp(raw[i]) / total  # in-place write to the leaf tensor
    one_hots = [int(j in idxs) for j in range(len(raw))]
    return raw * torch.tensor(one_hots).float()

a = torch.tensor([0.4, 0.1, 0.2, 0.1, 0.5], requires_grad=True)
b = new_softmax(a, 3)
sum(b).backward()
```
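I suspect the in-place assignment `raw[i] = ...` on the leaf tensor `a` is what breaks autograd. Here is a sketch of what I think a graph-safe version would look like: instead of writing into `raw`, it builds a fresh 0/1 mask (the mask tensor is my own addition, not part of the original code) and computes the masked softmax out-of-place:

```python
import torch

def new_softmax(raw, k):
    # Softmax over the top-k entries; all other entries become 0.
    values, idxs = raw.topk(k)
    mask = torch.zeros_like(raw)   # fresh tensor, not tracked by autograd
    mask[idxs] = 1.0               # in-place is fine here: mask is not a leaf in the graph
    exp = torch.exp(raw) * mask    # gradient flows through exp(raw), not the mask
    return exp / exp.sum()

a = torch.tensor([0.4, 0.1, 0.2, 0.1, 0.5], requires_grad=True)
b = new_softmax(a, 3)
b.sum().backward()                 # no RuntimeError; a.grad is populated
```

This gives the same values as the example above ([0.3420, 0.0000, 0.2800, 0.0000, 0.3780]) while leaving `a` untouched, so backward() has a valid graph to traverse.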