Hey everyone, I was curious whether it is possible to implement an elementwise multiplication as a convolutional layer or as a fully connected layer, for example.
I have tried the following, but it does not appear to give the correct output:
```python
import torch
import torch.nn as nn
# create a PyTorch Conv2d intended to act as an elementwise multiplication and compare it to a plain elementwise product
batch_size = 1
input_channels = 1
input_height = 3
input_width = 3
output_channels = 1
kernel_size = 3
#create a random input tensor
input = torch.randn(input_height, input_height, input_channels, batch_size)
print(input.shape)
reginput = torch.squeeze(input)
print(reginput.shape)
#define the convolutional layer
conv_layer = nn.Conv2d(input_height, input_height, bias=False, kernel_size= (1,1), stride= (input_height,input_width))
#create a second tensor to act as the kernel
weights = torch.tensor([[[[1, 2, 3], [4, 5, 6], [7, 8, 9]]]], dtype=torch.float32)
weightsreg = torch.squeeze(weights)
weightsconv = weights.view(input_height, input_height, 1, 1)
conv_layer.weight.data = weightsconv
#perform the convolution
output = conv_layer(input)
#perform the regular elementwise multiplication for comparison
output2 = reginput*weightsreg
#compare the outputs
print(output)
print(output2)
```
When run, it can be seen that the convolution output does not equal the regular elementwise multiplication. Is there anything I am missing? Thank you in advance.
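For reference, this is the kind of equivalence I am after. My assumption is that if every element of the 3x3 matrix is treated as its own channel, a grouped (depthwise) 1x1 convolution should reduce to a per-element product, since each output channel then only sees a single input channel. I am not sure this is the intended way to do it, but it seems to match the plain elementwise product:
```python
import torch
import torch.nn as nn

h, w = 3, 3
x = torch.randn(h, w)
weights = torch.tensor([[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]])

# groups = h*w makes the 1x1 convolution depthwise: each of the 9 channels
# is multiplied by its own single weight instead of being summed with the others
conv = nn.Conv2d(h * w, h * w, kernel_size=1, groups=h * w, bias=False)
with torch.no_grad():
    conv.weight.copy_(weights.reshape(h * w, 1, 1, 1))

# Conv2d expects (batch, channels, height, width), so flatten the 3x3 into 9 channels
out_conv = conv(x.reshape(1, h * w, 1, 1)).reshape(h, w)
print(torch.allclose(out_conv, x * weights))  # expected: True
```
And my understanding is that a fully connected layer should do the same thing if its weight matrix is diagonal:
```python
# same idea with a fully connected layer: a Linear whose weight is a diagonal
# matrix multiplies each input element by one weight and nothing else
lin = nn.Linear(h * w, h * w, bias=False)
with torch.no_grad():
    lin.weight.copy_(torch.diag(weights.flatten()))

out_lin = lin(x.flatten()).reshape(h, w)
print(torch.allclose(out_lin, x * weights))  # expected: True
```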