Bahdanau Attention on 1D input

Hi, I want to apply Bahdanau attention to my two inputs, but my input data has no sequence length dimension. How can I apply it correctly?

input1 = torch.randn(128, 64)
input2 = torch.randn(128, 64)
In my input, 128 represents the total number of training samples and 64 is the feature dimension. I want to apply Bahdanau attention to these two inputs.
I am applying it this way, but it is not giving me an output of shape (1, 64).
import torch
import torch.nn as nn
import torch.nn.functional as F


class BahdanauAttention(nn.Module):
    def __init__(self, hidden_size):
        super(BahdanauAttention, self).__init__()
        self.Wa = nn.Linear(hidden_size, hidden_size)  # projects the query
        self.Ua = nn.Linear(hidden_size, hidden_size)  # projects the keys
        self.Va = nn.Linear(hidden_size, 1)            # scores each key against the query

    def forward(self, query, keys):
        # query: (N, hidden_size), keys: (N, hidden_size)
        scores = self.Va(torch.tanh(self.Wa(query) + self.Ua(keys)))  # (N, 1)
        print(scores.shape)
        scores = scores.squeeze(1).unsqueeze(0)  # (1, N)
        print(scores.shape)
        weights = F.softmax(scores, dim=1)   # attention weights over the N rows
        context = torch.mm(weights, keys)    # (1, hidden_size)

        return context, weights
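
For reference, this is roughly how I am calling the module; a minimal sketch assuming hidden_size=64 and that input1 is used as the query and input2 as the keys:

attention = BahdanauAttention(hidden_size=64)

input1 = torch.randn(128, 64)  # query: 128 samples, 64 features
input2 = torch.randn(128, 64)  # keys: 128 samples, 64 features

context, weights = attention(input1, input2)
print(context.shape)  # I expect (1, 64) here
print(weights.shape)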