How can I perform element wise multiplication of two tensors?


(Kina) #1

Given a tensor a with shape (128, 1, 143) and another tensor b with shape (128, 80, 134), I want to perform an element-wise multiplication. I tried the following (a minimal repro snippet follows the list):

  1. a * b, but it raised a dimension mismatch error
  2. b * a.expand_as(b), but it also raised a dimension mismatch error
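
Here is the repro, using random tensors with the shapes above; the commented-out lines are the ones that fail:

```python
import torch

a = torch.randn(128, 1, 143)
b = torch.randn(128, 80, 134)

# Broadcasting can only stretch dimensions of size 1; the last dimensions
# are 143 vs 134, so the shapes cannot be matched:
# a * b               # fails: 143 and 134 cannot be broadcast together
# b * a.expand_as(b)  # fails: expand can only enlarge size-1 dimensions
```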

Can anyone suggest a solution?


(Alban D) #2

Hi,

Your tensors don’t have the same size. How do you match the elements to do element-wise operations?


(Kina) #3

Thank you for your response!
Is there any way to make the tensors the same size, either by making a the same size as b, or b the same size as a?


(Alban D) #4

Well, that depends on where they come from and what they contain.
The sizes should have a meaning; you need to check what that meaning is in your case.


(Kina) #5

I am applying conv1d to speech recognition. The input is 13-dimensional fbank features. Before feeding the input to the conv layer, I used x = x.view(batch, 1, seq_len), with batch size 128, 1 channel, and seq_len 143. Then I applied a conv with 80 filters and got a tensor of shape (128, 80, 134). Here I would like to add a highway component to get a highway conv. So I apply a sigmoid to the conv output and get k, then element-wise multiply k with the conv output and get the transform gate t, whose shape is the same as the conv output. Next, I subtract k from 1 and get a tensor c whose shape is also the same as the conv output. Finally, when I element-wise multiply tensor c with the input x, the dimension mismatch occurs.
Here are the shapes from the operations above (a minimal code sketch follows the list):

  1. (128, 1, 143) [input x]
  2. (128, 80, 134) [conv output tensor]
  3. (128, 80, 134) [k tensor]
  4. (128, 80, 134) [t tensor]
  5. (128, 80, 134) [c tensor]
  6. c * x [fails due to the size mismatch]

Thus, I want to make the two tensors the same size and perform the element-wise multiplication.
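
Here is a minimal sketch of the steps above. The kernel size of 10 is my assumption; it is the value that turns a sequence length of 143 into 134 when no padding is used (143 - 10 + 1 = 134).

```python
import torch
import torch.nn as nn

batch, seq_len = 128, 143
x = torch.randn(batch, 1, seq_len)        # 1. input x: (128, 1, 143)

conv = nn.Conv1d(in_channels=1, out_channels=80, kernel_size=10)
h = conv(x)                               # 2. conv output: (128, 80, 134)

k = torch.sigmoid(h)                      # 3. gate k: (128, 80, 134)
t = k * h                                 # 4. transform gate t: (128, 80, 134)
c = 1 - k                                 # 5. carry gate c: (128, 80, 134)

# 6. fails: (128, 80, 134) and (128, 1, 143) cannot be broadcast because
#    the last dimensions (134 vs 143) differ and neither of them is 1.
# gated = c * x
```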

(Alban D) #6

It looks like you’re missing padding in your convolution to make sure that the output sequence length is the same as the input sequence length. For that you need padding equal to half the kernel size.
You will still have a size mismatch in the channel dimension. From what I remember, highway components are usually used with convs that have the same number of input and output channels to avoid this issue. In your case, I guess you can expand the input to be the same size as the output if you really want to do this (a sketch is below).
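
A sketch of that suggestion, continuing from the shapes above (kernel size 10 is still an assumption; padding='same' requires stride 1 and a reasonably recent PyTorch, and with an odd kernel size padding=kernel_size // 2 gives the same result):

```python
import torch
import torch.nn as nn

batch, seq_len = 128, 143
x = torch.randn(batch, 1, seq_len)

# padding='same' keeps the output sequence length equal to the input length.
conv = nn.Conv1d(in_channels=1, out_channels=80, kernel_size=10, padding='same')
h = conv(x)                  # (128, 80, 143)

k = torch.sigmoid(h)         # (128, 80, 143)
t = k * h                    # (128, 80, 143)
c = 1 - k                    # (128, 80, 143)

# The channel dimensions still differ (80 vs 1); expanding the input makes
# the element-wise product explicit. Plain broadcasting over the size-1
# channel dimension would also work here.
gated = c * x.expand_as(c)   # (128, 80, 143)
```

The expand is just a view that repeats the single input channel 80 times, so it does not copy any data.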