Padding values ignored by conv layers, or PackedSequence for CNNs?

Hey,

I have a custom dataloader that provides a different number of inputs to the network for each `__getitem__` call, in the form of a dict:

`item.keys() = ["image", "ground_truth", "data_tensor_1", "data_tensor_2", "data_tensor_3", ...]`

So the actual number of data_tensor entries can differ from item to item. (It does not necessarily need to be stored as a dict, but so far this has been the easiest way to handle other things in my code.)
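
To make this concrete, my Dataset looks roughly like this (a simplified sketch; the shapes and the random count are just for illustration, not my actual data):

```python
import torch
from torch.utils.data import Dataset

class MyDataset(Dataset):
    """Simplified sketch: each item carries a varying number of data_tensors."""

    def __len__(self):
        return 100

    def __getitem__(self, idx):
        # the number of extra tensors differs from item to item
        n_extra = torch.randint(1, 5, (1,)).item()
        item = {
            "image": torch.randn(3, 64, 64),
            "ground_truth": torch.randn(1, 64, 64),
        }
        for i in range(n_extra):
            item[f"data_tensor_{i + 1}"] = torch.randn(3, 64, 64)
        return item
```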

In the forward pass of my network (fully convolutional), one part of the network is run once for each data_tensor. For example, if an item contains 3 data_tensors, that part is run 3 times. This works nicely as long as every dataset item has the same number of data_tensors. With my current configuration, however, the items within a batch do not have a consistent number of them, so this part of the network would have to be run a different number of times depending on the batch entry.
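
Simplified, the forward pass looks something like this (module names and shapes are placeholders, not my actual architecture):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Conv2d(3, 16, 3, padding=1)
        # the part that has to run once per data_tensor
        self.per_tensor_part = nn.Conv2d(3, 16, 3, padding=1)

    def forward(self, image, data_tensors):
        feat = self.backbone(image)
        # one pass through self.per_tensor_part for each data_tensor
        for t in data_tensors:
            feat = feat + self.per_tensor_part(t)
        return feat
```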

One solution I could think of was padding all items of a batch with dummy data_tensors so that every item in the batch looks the same. But if I then run such a dummy tensor (e.g. all zeros, since a tensor can't literally contain None) through a conv layer, does this have an influence on the model? Basically, I would like padding that is completely ignored by my network / conv layers.
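
What I had in mind is something like the following sketch, assuming all data_tensors share the same shape; the mask is my guess at how the padded entries could be cancelled out afterwards:

```python
import torch

def pad_batch(batch):
    """batch: list (one entry per item) of lists of same-shape data_tensors.
    Pads every item to the largest count in the batch and returns a mask
    marking which entries are real."""
    max_n = max(len(ts) for ts in batch)
    shape = batch[0][0].shape
    padded, mask = [], []
    for ts in batch:
        pad = [torch.zeros(shape) for _ in range(max_n - len(ts))]
        padded.append(torch.stack(list(ts) + pad))  # (max_n, C, H, W)
        mask.append(torch.tensor([1.0] * len(ts) + [0.0] * (max_n - len(ts))))
    # (B, max_n, C, H, W) and (B, max_n)
    return torch.stack(padded), torch.stack(mask)
```

The per-tensor part would then run max_n times for every item, and the output of pass i could be multiplied by `mask[:, i].view(-1, 1, 1, 1)` so that padded entries contribute nothing.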

Another thing I was thinking of is PackedSequence, which exists for RNNs. Is there a way of using something like it for conv layers?
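
To illustrate what I'm imagining: since a conv layer treats the samples in a batch independently anyway, maybe all real data_tensors could be concatenated into one flat batch (like PackedSequence's `data` attribute), processed once, and split back per item afterwards. A rough sketch of that idea:

```python
import torch

def pack_for_conv(batch):
    """batch: list of lists of same-shape data_tensors (variable count per item).
    Concatenates all real tensors into one flat batch and remembers
    how many belong to each item."""
    lengths = [len(ts) for ts in batch]
    flat = torch.cat([torch.stack(list(ts)) for ts in batch])  # (sum(lengths), C, H, W)
    return flat, lengths

# run the per-tensor conv part once on the flat batch, then split back:
# out = per_tensor_part(flat)
# per_item_outputs = out.split(lengths)
```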

I hope I was able to describe my problem in an understandable way.

Thanks a lot for any help!