Is it possible to implement an element-wise multiplication as a convolution layer?

Hey everyone, I was curious whether it is possible to implement an element-wise multiplication as, for example, a convolutional layer or a fully connected layer.

I have tried the following but it does not appear to give the correct output:

import torch
import torch.nn as nn

# we create a pytorch Conv2d to act as an element-wise multiplication and compare it to a plain element-wise product
batch_size = 1
input_channels = 1
input_height = 3
input_width = 3
output_channels = 1
kernel_size = 3

#create a random input tensor
input = torch.randn(input_height, input_width, input_channels, batch_size)
reginput = torch.squeeze(input)

#define the convolutional layer
conv_layer = nn.Conv2d(input_height, input_height, bias=False, kernel_size= (1,1), stride= (input_height,input_width))

#create a second tensor to act as the kernel
weights = torch.tensor([[[[1, 2, 3], [4, 5, 6], [7, 8, 9]]]], dtype=torch.float32)

weightsreg = torch.squeeze(weights)

#reshape the weights and load them into the conv layer
weightsconv = weights.view(input_height, input_width, 1, 1)
conv_layer.weight.data = weightsconv

#perform the convolution
output = conv_layer(input)

#perform the element-wise multiplication
output2 = reginput*weightsreg

#compare the outputs
print(torch.allclose(torch.squeeze(output), output2))

When run, the output does not equal the regular element-wise multiplication. Is there anything I am missing? Thank you in advance.

It is trivially doable in the case of a 1x1 input with N channels if you use a depthwise convolution (groups equal to the number of channels):

import torch

N = 32
a = torch.randn(N)
b = torch.randn(N)
# Treat each element as its own channel: with groups=N, a 1x1 depthwise conv
# multiplies channel i of the input by filter i with no mixing across channels.
out1 = torch.nn.functional.conv2d(
    a.reshape(1, N, 1, 1),   # input: batch 1, N channels, 1x1 spatial
    b.reshape(N, 1, 1, 1),   # N filters of 1 channel each (depthwise)
    groups=N,
).reshape(N)
out2 = a * b
print(torch.allclose(out1, out2))  # True
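The same trick generalizes to the H x W case in the original question: flatten the spatial grid so that every element becomes its own channel, apply the depthwise 1x1 convolution, then reshape back. A minimal sketch (the names `H`, `W`, `a`, `b` are my own, not from the question):

```python
import torch

# Element-wise product of two H x W matrices via a depthwise 1x1 convolution:
# flatten the H*W elements into channels, multiply channel-by-channel with
# groups=H*W, then restore the original shape.
H, W = 3, 3
a = torch.randn(H, W)
b = torch.randn(H, W)
out1 = torch.nn.functional.conv2d(
    a.reshape(1, H * W, 1, 1),   # input: batch 1, H*W channels, 1x1 spatial
    b.reshape(H * W, 1, 1, 1),   # one 1x1 filter per channel (depthwise)
    groups=H * W,                # no mixing across channels
).reshape(H, W)
out2 = a * b
print(torch.allclose(out1, out2))  # True
```

Note this cannot be done with an ordinary (non-grouped) Conv2d: each output channel of a standard convolution is a *sum* over all input channels, which is exactly why the code in the question mixes the rows together instead of multiplying element-wise.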