Indexing a tensor by floating point value

Can we index a tensor in PyTorch by a floating point value and keep the FloatTensor's gradient?

No, you cannot use floating point values as indices.
Depending on your use case, you might want to interpolate the original tensor instead.
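
For example, here is a minimal sketch of differentiable "indexing" by linear interpolation between the two neighbouring integer positions; the helper `interp_index` and the 1-D setup are illustrative assumptions, not a built-in PyTorch API:

    import torch

    def interp_index(t, idx):
        # Look up t at the float position idx; gradients flow to idx through
        # the interpolation weights, not through the integer indices.
        lo = idx.floor().long().clamp(0, t.numel() - 2)
        frac = idx - lo.float()
        return (1 - frac) * t[lo] + frac * t[lo + 1]

    x = torch.arange(10, dtype=torch.float32)
    i = torch.tensor(3.7, requires_grad=True)
    y = interp_index(x, i)   # differentiable lookup at position 3.7
    y.backward()
    print(i.grad)            # the float index receives a gradient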

What I actually want to do is this: initialize an all-ones mask, split it into two parts along the radial direction at a threshold `high`, keep one part at one and set the other part to zero. My code is as follows:
    import torch
    import torch.nn as nn

    h = nn.Parameter(torch.tensor(0.5), requires_grad=True)  # learnable fraction of the max radius

    w, height = 64, 64  # spatial size of the mask
    t_w, t_h = torch.tensor(float(w)), torch.tensor(float(height))
    max_radius = torch.sqrt(t_w * t_w + t_h * t_h)
    high = h * max_radius  # learnable cutoff radius

    X, Y = torch.meshgrid(torch.linspace(0, height - 1, height),
                          torch.linspace(0, w - 1, w))
    D = torch.sqrt(X * X + Y * Y)  # distance of every pixel from the origin

    mask = torch.ones((height, w))
    mask[D < high] = 0  # boolean indexing: no gradient flows back to `high`

I want to obtain the mask described above, but it seems the gradient from mask back to high is truncated. How can I solve this issue? Thanks!
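
One common workaround (a sketch of my own, not necessarily what you need) is to replace the hard boolean threshold with a smooth sigmoid, so the mask is a differentiable function of `high`; `temperature` here is a hypothetical sharpness parameter:

    import torch
    import torch.nn as nn

    h = nn.Parameter(torch.tensor(0.5))   # learnable radius fraction (placeholder init)
    w, height = 64, 64
    max_radius = torch.sqrt(torch.tensor(float(w * w + height * height)))
    high = h * max_radius

    X, Y = torch.meshgrid(torch.arange(height, dtype=torch.float32),
                          torch.arange(w, dtype=torch.float32))
    D = torch.sqrt(X * X + Y * Y)

    temperature = 5.0                     # hypothetical sharpness of the transition
    soft_mask = torch.sigmoid((D - high) / temperature)  # ~0 inside the radius, ~1 outside

    soft_mask.sum().backward()
    print(h.grad)                         # non-None: the gradient reaches h through the mask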