I am using torch.utils.data.DataLoader to load my data in batches. It loads the data fine, but DataLoader prepends the batch size as the first dimension of the returned tensor. When I pass the result to a GRU or LSTM, the model complains that the input has an extra dimension. Is there a way to avoid this? For example, my dataset has shape [4, 1, 6], but the DataLoader returns a tensor of shape [1, 4, 1, 6], which the LSTM or GRU will not accept.
Even if I set batch_size = 1, it still prepends a 1 to the shape, giving [1, 4, 1, 6].
How can I get the batch to have shape [4, 1, 6] instead of [1, 4, 1, 6]?
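Here is a minimal sketch that reproduces what I am seeing. The `SeqDataset` class below is a hypothetical stand-in for my real dataset (which is loaded differently): it holds a single sample of shape [4, 1, 6], and the DataLoader stacks it into a batch with an extra leading dimension.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class SeqDataset(Dataset):
    """Hypothetical toy dataset: one sample that already has shape [4, 1, 6]."""
    def __init__(self):
        self.sample = torch.randn(4, 1, 6)

    def __len__(self):
        return 1

    def __getitem__(self, idx):
        return self.sample

loader = DataLoader(SeqDataset(), batch_size=1)
batch = next(iter(loader))
print(batch.shape)  # torch.Size([1, 4, 1, 6]) -- the extra leading batch dimension
```

I would like the tensor coming out of the loader (or some equivalent setup) to keep the original [4, 1, 6] shape so I can feed it straight into the recurrent layer.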