I have a tensor `R` containing distances to the center position and a parameter `r_max` defining the maximum distance. I want to generate a mask like `R.lt(r_max)` and get the gradient with respect to `r_max`. How should I generate the mask tensor to make this possible? I tried `lt()` and `torch.where()`, and both failed.

If you want a boolean mask (e.g., as returned by `lt`), it is not a floating-point type, so you won’t be able to compute its gradient with respect to anything. If you do something like `clamp_max`, you could get gradients with respect to `R` and `r_max`, but I’m not sure if that’s what you need.
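A minimal sketch of the `clamp_max` idea, assuming a recent PyTorch (roughly 1.9+), where `torch.clamp` accepts a tensor bound so gradients flow to both `R` and `r_max`; on older versions a tensor bound raises an error:

```python
import torch

# Assumption: torch.clamp accepts a Tensor as `max` (PyTorch ~1.9+).
R = torch.tensor([1.0, 5.0, 10.0, 20.0], requires_grad=True)
r_max = torch.tensor(15.0, requires_grad=True)

clamped = torch.clamp(R, max=r_max)  # elementwise min(R, r_max)
clamped.sum().backward()

# Where R < r_max the output follows R; where R >= r_max it follows r_max.
print(R.grad)      # 1 for entries below the bound, 0 for clamped entries
print(r_max.grad)  # number of entries that hit the bound
```

Here `r_max.grad` counts the elements that were clamped, since each of them contributes the value `r_max` to the sum.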

I will use the mask to do a calculation, so I think `clamp` can solve my problem. Thanks a lot.

I’m sorry, it seems that `clamp` only accepts a `Number` as `min`/`max`, which is not differentiable.
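As a possible workaround for that limitation, assuming `torch.minimum` is available (PyTorch 1.7+): it takes two tensors, broadcasts them, and is differentiable with respect to both, so a scalar tensor bound does receive a gradient:

```python
import torch

# Assumption: torch.minimum (PyTorch 1.7+), which unlike a Number-only
# clamp is differentiable with respect to the bound tensor as well.
R = torch.tensor([2.0, 8.0, 14.0, 30.0])
r_max = torch.tensor(15.0, requires_grad=True)

clamped = torch.minimum(R, r_max)  # same values as clamping R at r_max
clamped.sum().backward()

print(r_max.grad)  # counts the entries where r_max was the smaller value
```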

I found a way to make `r_max` differentiable:

```
import torch

x = torch.linspace(-20, 19, 40)
Y, X = torch.meshgrid(x, x, indexing='ij')  # explicit indexing avoids a warning on newer PyTorch
R = (X**2 + Y**2)**0.5                      # distance from the center
r_max = torch.tensor(15.0, requires_grad=True)
R_norm = R / r_max
mask = R_norm.where(R_norm > 1, torch.zeros_like(R_norm))  # R_norm outside r_max, 0 inside
mask = mask.where(R_norm < 1, torch.ones_like(R_norm))     # unchanged inside, 1 outside
```

Thanks for sharing! I had the same question.

Sorry for ignoring the difference between `tensor.where(condition, y)` and `np.where(condition, y)`. In order to get results like `R.lt(r_max)`, the code should be:

```
mask = R_norm.where(R_norm > 1, torch.ones_like(R_norm))   # 1 inside r_max, R_norm outside
mask = mask.where(R_norm < 1, torch.zeros_like(R_norm))    # unchanged inside, 0 outside
```
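One caveat worth noting: the hard 0/1 mask above gives `r_max` a zero gradient almost everywhere, since the output values are constants except exactly at the boundary. If a usable gradient is actually needed, a common alternative is a soft mask built from a sigmoid; the sharpness `tau` below is an illustrative choice, not part of the original thread:

```python
import torch

# Sketch of a soft, differentiable mask. `tau` (the transition width)
# is an illustrative parameter: smaller tau approaches the hard mask.
x = torch.linspace(-20, 19, 40)
Y, X = torch.meshgrid(x, x, indexing='ij')
R = (X**2 + Y**2)**0.5
r_max = torch.tensor(15.0, requires_grad=True)

tau = 0.5
soft_mask = torch.sigmoid((r_max - R) / tau)  # ~1 inside r_max, ~0 outside

soft_mask.sum().backward()
print(r_max.grad)  # nonzero: contributions from pixels near the boundary
```

Multiplying values by `soft_mask` instead of a boolean mask keeps `r_max` in the autograd graph with a nonzero gradient.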