Hi,
let’s say I have a tensor x with dimensions [batch, channels, H, W].
Then I have another tensor b that holds bias values for each channel, with dims [channels,].
I want y = x + b.
Is there a nice way to broadcast this over H and W, for each channel of each sample in the batch, without using a loop?
If I’m convolving, I know I can use the bias argument of the conv function to achieve this, but I’m just wondering whether this can be done with primitive ops alone (no explicit looping).
Ah ok.
Basically, we use None in the indexing expression to introduce a singleton dimension (similar to tensor.unsqueeze(dim=N) for some N).
When you use b[None, :, None, None], three singleton dimensions are introduced in b, giving it shape [1, channels, 1, 1]. The operation then becomes an addition between two 4-dimensional tensors, and broadcasting replicates b over the batch, H, and W dimensions.
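Here’s a minimal sketch of this (the concrete shape values are just for illustration):

```python
import torch

# Hypothetical example shapes
batch, channels, H, W = 2, 3, 4, 5
x = torch.randn(batch, channels, H, W)
b = torch.randn(channels)

# Insert singleton dims so b has shape [1, channels, 1, 1];
# broadcasting then replicates b over batch, H, and W.
y = x + b[None, :, None, None]

# Equivalent alternatives:
y2 = x + b.view(1, channels, 1, 1)
y3 = x + b.unsqueeze(0).unsqueeze(-1).unsqueeze(-1)

print(y.shape)  # torch.Size([2, 3, 4, 5])
```

Note that b[:, None, None] (shape [channels, 1, 1]) also works, since broadcasting aligns trailing dimensions and fills in the missing batch dimension automatically.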