Vectorized power function

I have a single tensor whose elements all need to be raised to each power contained in another tensor.

Let’s say I have tensor A with shape [100] and tensor B with shape [64, 5]. Tensor A is my initial tensor, and tensor B contains a batch of 64 exponent vectors of 5 elements each.

Tensor C should be [[torch.pow(A, exp) for exp in batch] for batch in B], with shape [64, 100, 5]. torch.pow does not support this natively (as described in the docs, it does element-by-element exponentiation when exp is a tensor of the same shape). How I do it now is with the aforementioned double for-loop, but it is not efficient. Is there a way to vectorize this?

Hi Francesco!

Use pytorch’s broadcasting together with unsqueeze() to line the
dimensions up appropriately:

# [100, 1] ** [64, 1, 5] broadcasts to [64, 100, 5]
torch.pow (A.unsqueeze (-1), B.unsqueeze (1))
# or, equivalently, making the batch dimension explicit:
torch.pow (A.unsqueeze (0).unsqueeze (-1), B.unsqueeze (1))
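A quick runnable sketch (tensor names and sizes taken from the question) that checks the broadcast version against the explicit double loop:

```python
import torch

A = torch.rand(100)     # base tensor, shape [100]
B = torch.rand(64, 5)   # batch of 64 exponent vectors, shape [64, 5]

# Broadcast: [100, 1] ** [64, 1, 5] -> [64, 100, 5]
C = torch.pow(A.unsqueeze(-1), B.unsqueeze(1))

# Reference computed with the original double for-loop
C_loop = torch.stack([
    torch.stack([torch.pow(A, exp) for exp in batch], dim=-1)
    for batch in B
])

print(C.shape)                        # torch.Size([64, 100, 5])
print(torch.allclose(C, C_loop))      # True
```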


K. Frank