Clamp a Tensor dimension-wise

Hi,

say x is a tensor with shape (M, N) while min and max have shape (M,).

I basically want to do:

# note: here `min` and `max` are tensors of shape (M,) shadowing the Python builtins
for m in range(M):
    torch.clamp(x[m, :], min=min[m], max=max[m], out=x[m, :])

That would be applying clamp dimension-wise. Can I avoid the for loop?

thanks

Answering my own question:

torch.min(
    torch.max(x, min[:, None]),
    max[:, None])

Less elegant than a single call to clamp, but it works fine.

A more elegant way (on PyTorch versions where clamp accepts tensor bounds, i.e. 1.9+) would be to broadcast the bound tensors directly, here named min_tensor and max_tensor to avoid shadowing the builtins:

x.clamp(min=min_tensor.unsqueeze(-1), max=max_tensor.unsqueeze(-1))
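For completeness, a self-contained sketch checking that the broadcast-based clamp matches the per-row loop (assuming PyTorch 1.9+, where `torch.clamp` accepts tensor bounds; the names `lo`/`hi` are chosen here to avoid shadowing the builtins):

```python
import torch

M, N = 4, 5
x = torch.randn(M, N)
lo = torch.full((M,), -0.5)  # per-row lower bounds, shape (M,)
hi = torch.full((M,), 0.5)   # per-row upper bounds, shape (M,)

# Reference: clamp each row with its own scalar bounds.
expected = torch.stack(
    [x[m].clamp(min=lo[m].item(), max=hi[m].item()) for m in range(M)]
)

# Vectorized: unsqueeze the bounds to shape (M, 1) so they
# broadcast across the N columns of x.
result = x.clamp(min=lo[:, None], max=hi[:, None])

assert torch.equal(result, expected)
```

The same broadcasting trick works for the `torch.min(torch.max(...))` composition above; `clamp` just expresses it in one call.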