I have an input whose size varies, e.g. (102 x 128) or (102 x 1100).
I created a network that should, in theory, work for these different input sizes. In my model class, the forward method is defined as follows:
def forward(self, inputs):
    x = self.Batchnorm1(inputs)   # normalize the input features
    x, _ = self.LSTM(x)           # LSTM output at every time step
    x, _ = torch.max(x, 1)        # global max pooling over dim 1
    x = self.linear(x)
    x = torch.sigmoid(x)
    return x
That is, by using global max pooling over the variable dimension, the output size always matches regardless of the input length.
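For concreteness, here is a runnable sketch of the model I have in mind. The layer sizes (102 input channels, hidden size 64, a single sigmoid output) are placeholders, and the transpose between BatchNorm1d (which expects batch x channels x length) and the batch_first LSTM is my assumption about how the dimensions line up:

import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.Batchnorm1 = nn.BatchNorm1d(102)   # 102 channels: assumed
        self.LSTM = nn.LSTM(input_size=102, hidden_size=64, batch_first=True)
        self.linear = nn.Linear(64, 1)

    def forward(self, inputs):        # inputs: (batch, 102, length)
        x = self.Batchnorm1(inputs)   # (batch, 102, length)
        x = x.transpose(1, 2)         # assumed: (batch, length, 102) for the LSTM
        x, _ = self.LSTM(x)           # (batch, length, 64)
        x, _ = torch.max(x, 1)        # pool out the variable length: (batch, 64)
        x = self.linear(x)            # (batch, 1)
        return torch.sigmoid(x)

model = Model()
print(model(torch.randn(4, 102, 128)).shape)    # torch.Size([4, 1])
print(model(torch.randn(4, 102, 1100)).shape)   # torch.Size([4, 1])

Both calls print torch.Size([4, 1]), so the architecture itself is length-independent.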
Besides, I also customized the DataLoader's collate_fn so that a batch can hold inputs of different sizes, following https://jdhao.github.io/2017/10/23/pytorch-load-data-and-make-batch/.
However, the problem persists: the new customized collate_fn below
def my_collate(batch):
    data = [item[0] for item in batch]    # just form a list of tensors, no stacking
    target = [item[1] for item in batch]
    target = torch.LongTensor(target)
    return [data, target]
still returns the input data in a form (a plain Python list of tensors) that I cannot feed into batch normalization, or any other layer, in my forward function, since those layers expect a 3-d tensor of shape (batch size x dim1 x dim2).
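A minimal reproduction of the mismatch (the sample shapes are made up to match the sizes above):

import torch
import torch.nn as nn

# Two samples with the same channel count but different lengths,
# exactly what my_collate hands back as a plain Python list.
data = [torch.randn(102, 128), torch.randn(102, 1100)]

bn = nn.BatchNorm1d(102)
# bn(data)            # fails: a list is not a Tensor
# torch.stack(data)   # fails: tensors of unequal size cannot be stacked
out = bn(data[0].unsqueeze(0))   # works, but only one sample at a time
print(out.shape)                 # torch.Size([1, 102, 128])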
What changes should I make to my model so that it can accept this input?
P.S. Padding is not an option here.
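For reference, a per-sample loop like the sketch below does run (model, criterion, and optimizer are placeholders for the usual pieces), but it gives up real batching, which is why I am asking about changing the model instead:

for data, target in loader:                 # loader built with collate_fn=my_collate
    optimizer.zero_grad()
    losses = []
    for sample, label in zip(data, target):
        out = model(sample.unsqueeze(0))    # add a batch dim of 1: (1, 102, length)
        losses.append(criterion(out, label.view(1, 1).float()))
    loss = torch.stack(losses).mean()       # average the per-sample losses
    loss.backward()
    optimizer.step()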