Tied "conv1d" and "conv_transpose1d" not geting the same result as the input

Here is a snippet of my test code:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    data = torch.tensor([[[1.], [2.], [3.], [4.]]])  # N=1 * L=4 * C_in=1
    data = data.permute(0, 2, 1)                     # -> (N, C_in, L)
    conv = nn.Conv1d(1, 1, 2, bias=False)
    print(conv.weight)
    output = conv(data)
    print("output : {}".format(output))
    dweight = conv.weight.transpose(0, 1).flip(2)    # or without the flip: conv.weight.transpose(0, 1)
    print("dweight : {}".format(dweight))

    dconv = F.conv_transpose1d(input=output, weight=dweight, bias=None)
    print(dconv)   # != data

What I expected: input -> conv1d(input) -> conv_transpose1d(conv1d(input)), and the final result should be equal to the original input.

But they are not equal, whether I flip the temporal axis of the weight or not. They are supposed to be the same, right?

I’m really confused and frustrated here. Could anyone help me figure it out?

Please don’t tag specific people, as this might discourage others from answering.

I think I misunderstood the “tied weight” concept.

I wrote conv_transpose1d out in its doubly block circulant matrix form and found that one actually doesn’t need to flip the temporal axis.
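
Here is a minimal sketch of what I mean (assuming stride 1, no padding, a single input/output channel, and a length-4 input like in my snippet): build the explicit convolution matrix C and check that conv_transpose1d with the transposed but un-flipped weight is exactly multiplication by C^T.

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)

    x = torch.tensor([[[1., 2., 3., 4.]]])   # (N=1, C_in=1, L=4)
    w = torch.randn(1, 1, 2)                 # conv1d weight: (C_out=1, C_in=1, K=2)

    y = F.conv1d(x, w)                       # length L - K + 1 = 3

    # explicit matrix form of the convolution: y = C @ x
    L, K = x.shape[-1], w.shape[-1]
    C = torch.zeros(L - K + 1, L)
    for i in range(L - K + 1):
        C[i, i:i + K] = w[0, 0]

    print(torch.allclose(y.flatten(), C @ x.flatten()))      # True

    # tied transposed conv: swap the channel dims, do NOT flip the kernel
    z = F.conv_transpose1d(y, w.transpose(0, 1))
    print(torch.allclose(z.flatten(), C.t() @ y.flatten()))  # True -> conv_transpose1d multiplies by C^T

    # ... but that is generally not the original input
    print(torch.allclose(z, x))                              # False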

Suppose the conv1d’s matrix is C and the corresponding conv_transpose1d’s matrix is C^T.

The square matrix C^T C is apparently not always the identity matrix, so the result need not be identical to the input.
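
For example, with an arbitrary length-2 kernel (w0, w1) = (0.5, -1.0) on a length-4 input, the product C^T C looks like this:

    import torch

    # hypothetical length-2 kernel on a length-4 input (stride 1, no padding)
    w0, w1 = 0.5, -1.0
    C = torch.tensor([[w0, w1, 0., 0.],
                      [0., w0, w1, 0.],
                      [0., 0., w0, w1]])

    print(C.t() @ C)
    # tensor([[ 0.2500, -0.5000,  0.0000,  0.0000],
    #         [-0.5000,  1.2500, -0.5000,  0.0000],
    #         [ 0.0000, -0.5000,  1.2500, -0.5000],
    #         [ 0.0000,  0.0000, -0.5000,  1.0000]])

So unless the kernel happens to be very special, C^T C ≠ I and conv_transpose1d(conv1d(x)) ≠ x.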