Prepending layer to the input layer

Given a neural network with 10 neurons in the input layer, 3 in the hidden layer, and 1 in the output layer, is there a way to prepend a layer to the input layer?

For instance, I have a dataset of five 2D tensors (3x10), each with one label value (for instance, the task is regression).

I want to create a 1D tensor (size 10) and pass it to the input layer with 10 neurons. To create this 1D tensor, each row of a 2D tensor is multiplied by the corresponding value in a weight tensor of size 3. This yields three weighted rows, which are summed to produce a final 1D tensor of size 10 that is fed to the input layer.
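The reduction described above could be sketched like this (tensor names are placeholders):

```python
import torch

# One 3x10 document tensor is collapsed into a 1D tensor of size 10:
# row i is scaled by w[i], then the three weighted rows are summed.
doc = torch.randn(3, 10)  # one 2D document tensor
w = torch.randn(3)        # one weight per row

# Broadcast w over the columns, then sum over the row dimension.
vec = (w.unsqueeze(1) * doc).sum(dim=0)  # shape: (10,)

# This is equivalent to the matrix-vector product w @ doc.
print(vec.shape)  # torch.Size([10])
```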

My question is how to prepend this weight tensor and make sure its values are updated during training.


Based on your description, it seems you would like to create a weight parameter, which will scale the inputs, and which should be trainable.
If that’s the case, you could initialize it as:

weight = nn.Parameter(torch.randn(3).to(device))

and pass it with all other parameters to the optimizer:

optimizer = optim.Adam([weight] + list(model.parameters()), lr=1e-3)

This will make sure all passed parameters are optimized.
Autograd will take care of creating the computation graph based on the applied operations.
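Putting the two snippets together, a minimal end-to-end sketch (the model and data here are placeholders matching the sizes in the question) showing that both the extra weight and the model parameters receive gradients:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Placeholder network: 10 -> 3 -> 1, as in the question.
model = nn.Sequential(nn.Linear(10, 3), nn.ReLU(), nn.Linear(3, 1))
weight = nn.Parameter(torch.randn(3))  # trainable row weights

optimizer = optim.Adam([weight] + list(model.parameters()), lr=1e-3)
criterion = nn.MSELoss()

doc = torch.randn(3, 10)  # one document tensor
target = torch.randn(1)   # its regression label

optimizer.zero_grad()
# Weight each row and sum the rows into a size-10 input vector.
x = (weight.unsqueeze(1) * doc).sum(dim=0)
loss = criterion(model(x), target)
loss.backward()
optimizer.step()

print(weight.grad)  # non-None: autograd reaches the prepended weight
```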

Thanks a lot for your reply. It partially answers my question. I have attached an image of what I intend to achieve.

There are 5 document tensors (each is 3 x 10).

  1. I want to apply weights to the individual rows.
  2. Sum these weighted rows to get a document vector, which then goes as input to the neural network.
  3. At the same time, I want the initial weights w to be updated too.
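For reference, here is a rough sketch of what I have in mind as a single module (class and attribute names are just placeholders), with the row weights registered as a parameter so they would be updated together with the rest of the network:

```python
import torch
import torch.nn as nn

class DocNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.row_weight = nn.Parameter(torch.randn(3))  # one weight per row
        self.net = nn.Sequential(
            nn.Linear(10, 3),  # input layer -> hidden layer (10 -> 3)
            nn.ReLU(),
            nn.Linear(3, 1),   # hidden layer -> output (3 -> 1)
        )

    def forward(self, docs):  # docs: (batch, 3, 10)
        # Weight each row, sum rows -> (batch, 10), then run the network.
        x = (self.row_weight.view(1, 3, 1) * docs).sum(dim=1)
        return self.net(x)

model = DocNet()
docs = torch.randn(5, 3, 10)  # the five document tensors
out = model(docs)
print(out.shape)  # torch.Size([5, 1])
```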

Can you please help me figure this out, thanks?