How to apply dropout to drop an entire vector

I am concatenating two vectors, say A and B, into a single vector. Each of them has a dimension of 128. During training I would like to randomly drop one of them completely. How can I do that?
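
Roughly, the setup looks like this (a minimal sketch; the names and shapes are just illustrative):

import torch

A = torch.randn(128)   # first feature vector
B = torch.randn(128)   # second feature vector
x = torch.cat([A, B])  # concatenated input, shape (256,)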

You can simply define a dropout layer and pass the tensor to it. For example:

import torch

a = torch.rand(1, 5)
print(a)  # e.g. tensor([[0.3119, 0.1485, 0.6420, 0.4604, 0.0724]])
dropout = torch.nn.Dropout(1)  # p=1, so every element is dropped
print(dropout(a))
# output is tensor([[0., 0., 0., 0., 0.]])

@a_d But it always drops that vector.

Yes, the dropout layer will always make everything zero because its probability is set to 1.
To drop the vector only sometimes, draw a random number between 0 and 1 and apply the dropout only when it exceeds a threshold:

import random

threshold = 0.5  # illustrative value: b is dropped roughly half the time
prob = random.uniform(0, 1)
if prob > threshold:
    b = dropout(b)  # the Dropout(1) layer from above zeroes b out

Then you can concatenate the two vectors.
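
Put together, one sketch of the whole step (assumptions: only b is ever the candidate for dropping, as in the snippet above, and threshold = 0.5 is an illustrative value):

import random
import torch

a = torch.randn(128)
b = torch.randn(128)

dropout = torch.nn.Dropout(1)  # p=1 zeroes out the whole tensor it is applied to
threshold = 0.5

if random.uniform(0, 1) > threshold:
    b = dropout(b)  # b becomes all zeros

c = torch.cat((a, b))  # shape (256,); the b half may be all zeros

Note that nn.Dropout is only active in training mode; calling dropout.eval() turns it into a no-op, so the vector is never dropped at inference time.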

If you want to stack these tensors together, you could use something like this:

import torch

a = torch.randn(128)
b = torch.randn(128)
# randperm(2) yields [0, 1] or [1, 0], so exactly one row is multiplied by 0
c = torch.stack((a, b)) * torch.randperm(2).unsqueeze(1)

which would zero out one of these tensors completely.
If that’s not your use case, could you explain the shapes a bit more?
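
If the goal is still a single concatenated vector as in the question, the stacked result can be flattened back into 256 dimensions (a sketch along the same lines; mask is just a name for the randperm factor):

import torch

a = torch.randn(128)
b = torch.randn(128)

mask = torch.randperm(2).unsqueeze(1)       # shape (2, 1): one row of 1s, one of 0s
c = (torch.stack((a, b)) * mask).flatten()  # shape (256,); one half is all zeros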