How to normalize a tensor to have values between a and b

If I have a tensor A = torch.rand(30, 500, 50, 50), what is the smartest and fastest way to normalize each layer (the layers along A.size(1)) to have values between a and b?
The naive way is:

B = torch.zeros(A.size())

# a and b are the lower and upper target bounds
for n in range(A.size(0)):
    for c in range(A.size(1)):
        A_min = torch.min(A[n, c, :, :])
        A_max = torch.max(A[n, c, :, :])
        B[n, c, :, :] = (b - a) * (A[n, c, :, :] - A_min) / (A_max - A_min) + a

But it is super slow…

I haven’t timed the code yet, but it should be faster than for loops:

x1 = torch.randn(30, 500, 50, 50)
x1_min = torch.min(x1, dim=3, keepdim=True)[0].min(2, keepdim=True)[0]
x1_max = torch.max(x1, dim=3, keepdim=True)[0].max(2, keepdim=True)[0]

# a and b are the lower and upper target bounds
x2 = (b - a) * (x1 - x1_min) / (x1_max - x1_min) + a
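
For what it's worth, on more recent PyTorch versions the per-layer min/max can also be taken in a single call with torch.amin / torch.amax over both spatial dims. This is only a sketch of that alternative (the a, b values below are placeholder target bounds), with a quick check that every layer ends up spanning [a, b]:

import torch

a, b = -1.0, 1.0                      # example target bounds
x1 = torch.randn(30, 500, 50, 50)

# reduce over both spatial dims (2 and 3) in one call
x1_min = torch.amin(x1, dim=(2, 3), keepdim=True)
x1_max = torch.amax(x1, dim=(2, 3), keepdim=True)
x2 = (b - a) * (x1 - x1_min) / (x1_max - x1_min) + a

# every (batch, layer) slice should now span [a, b]
print(torch.allclose(x2.amin(dim=(2, 3)), torch.tensor(a)))  # True
print(torch.allclose(x2.amax(dim=(2, 3)), torch.tensor(b)))  # True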

Thanks!
I always learn new stuff from your code!
