Elementwise multiplication on an image inside neural network

I am trying to train a neural network for which the inputs and outputs are both 2D numerical arrays. I want the last layer to be an 'element-wise multiplying layer', i.e., if the input to the layer is A_ij, then I want the output to be W_ij * A_ij + B_ij, where W_ij and B_ij are the weight and bias tensors of the last layer.

I wrote down below how I am trying to do this. It seems inefficient, and the loss is not going down with each iteration. Is there a cleverer/more efficient way of doing this?
At an even more elementary level, will autograd update the gradients dL/dW_ij and dL/dB_ij? I think it should. Am I missing something?

    def __init__(self):
        super().__init__()
        self.convs = ...  # some convolutions
        self.LastLayerWeights = nn.Conv2d(1, 1, (Arows, Acols))
        self.LastLayerBias = nn.Conv2d(1, 1, (Arows, Acols))

    def forward(self, x):
        x = ...  # some activations after self.convs
        x = self.LastLayerWeights.weight * x + self.LastLayerBias.bias
        return x

Try using nn.Parameter: Parameter — PyTorch 2.3 documentation

Your code would look like this:

    self.LastLayerWeights = nn.Parameter(torch.randn(1, 1, Arows, Acols))
    self.LastLayerBias = nn.Parameter(torch.randn(1, 1, Arows, Acols))

    x = self.LastLayerWeights * x + self.LastLayerBias
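Putting the two snippets together, a minimal self-contained sketch (module and dimension names are illustrative; substitute your actual Arows/Acols and preceding conv stack) also lets you confirm that autograd does populate gradients for both parameters:

```python
import torch
import torch.nn as nn

Arows, Acols = 4, 4  # hypothetical image dimensions

class ElementwiseAffine(nn.Module):
    """Last layer computing W * A + B elementwise."""
    def __init__(self):
        super().__init__()
        # nn.Parameter registers these tensors so autograd tracks them
        # and optimizers see them via model.parameters().
        self.LastLayerWeights = nn.Parameter(torch.randn(1, 1, Arows, Acols))
        self.LastLayerBias = nn.Parameter(torch.randn(1, 1, Arows, Acols))

    def forward(self, x):
        # Broadcasts over the batch dimension of x: (N, 1, Arows, Acols)
        return self.LastLayerWeights * x + self.LastLayerBias

layer = ElementwiseAffine()
x = torch.randn(8, 1, Arows, Acols)
loss = layer(x).pow(2).mean()
loss.backward()

# Both parameters receive gradients, so an optimizer will update them.
print(layer.LastLayerWeights.grad is not None)  # True
print(layer.LastLayerBias.grad is not None)     # True
```

Because the parameters have a leading (1, 1) shape, the same W and B are broadcast across every image in the batch, which is exactly the per-pixel affine map W_ij * A_ij + B_ij described above.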

Works much better. Thanks