aliutkus
(Antoine Liutkus)
1
Hi,

Say x is a tensor with shape (M, N), while min and max have shape (M,). I basically want to do:

for m in range(M):
    torch.clamp(x[m, :], min=min[m], max=max[m], out=x[m, :])

That would be applying clamp row-wise. Can I avoid the for loop?

Thanks
aliutkus
(Antoine Liutkus)
2
Answering my own question:

torch.min(torch.max(x, min[:, None]), max[:, None])

Less elegant than a call to clamp, but it works fine.
amitabe
(Amit)
3
A more elegant way would be:
x.clamp(min=min_tensor.unsqueeze(-1), max=max_tensor.unsqueeze(-1))
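A small sketch of this broadcast clamp, assuming a PyTorch version where clamp accepts tensor bounds (1.9 or later); the tensor values and the names min_tensor/max_tensor are illustrative:

```python
import torch

x = torch.tensor([[-2., 0., 2., 5.],
                  [-1., 1., 3., 7.]])
min_tensor = torch.tensor([-1., 0.])   # per-row minimum, shape (M,)
max_tensor = torch.tensor([ 3., 5.])   # per-row maximum, shape (M,)

# unsqueeze(-1) turns the (M,) bounds into (M, 1), so clamp
# broadcasts them across the columns of x.
out = x.clamp(min=min_tensor.unsqueeze(-1), max=max_tensor.unsqueeze(-1))
```

This gives the same result as the torch.min/torch.max composition, in a single call.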