Hello,
I have the following setup for my data
data/
- 0000/
  - 0000.png
  - 0001.png
  - ...
- 0001/
  - 0000.png
  - ...
- ...
I’m trying to load 5 sequential frames so that one batch would return the following:
batch0:
- 0000/0000.png, 0000/0001.png, ..., 0000/N.png
- 0000/0001.png, 0000/0002.png, ..., 0000/N+1.png
I’m quite new to PyTorch and feel a bit lost among the different alternatives. I’ve looked at the following threads, but haven’t really found anything that hits home:
In the end I want to stack the images into a single tensor of dimension (W x H x D x N). Can I combine the dataset’s `__getitem__` and the sampler to make sure that the first sample contains the first N images and the last sample contains the last N images? My spontaneous feeling is that I should return the stacked tensor from `__getitem__`, which would make it easy to create batches of “sequences” since each sequence is already stacked, but how do I restrict the indices correctly?
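Here is roughly what I have in mind, as an untested minimal sketch: a per-clip dataset whose `__getitem__` returns a window of `seq_len` consecutive frames, and whose `__len__` only counts valid starting indices (the class name `FrameSequenceDataset`, the `seq_len` argument, and the use of torchvision’s `read_image` are just my assumptions):

```python
import os

import torch
from torch.utils.data import Dataset
from torchvision.io import read_image


class FrameSequenceDataset(Dataset):
    """Returns a stack of `seq_len` consecutive frames from one clip folder."""

    def __init__(self, clip_dir, seq_len=5):
        self.seq_len = seq_len
        self.paths = sorted(
            os.path.join(clip_dir, f)
            for f in os.listdir(clip_dir)
            if f.endswith(".png")
        )

    def __len__(self):
        # Only count valid starting indices, so the last sample is the
        # window that ends on the very last frame of the clip.
        return len(self.paths) - self.seq_len + 1

    def __getitem__(self, idx):
        frames = [read_image(p) for p in self.paths[idx:idx + self.seq_len]]
        # (seq_len, C, H, W); permute here if the (W x H x D x N) ordering is needed
        return torch.stack(frames)
```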
@ptrblck’s answers in the first and second link are definitely interesting, and I’ve looked at using SequentialSampler/BatchSampler, but in this case that doesn’t solve the problem of “starting” at the Nth image.
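For several clip folders I was thinking of something along these lines (again untested; it reuses the `FrameSequenceDataset` sketch above and assumes `data/` contains only the clip subfolders), relying on the default sequential sampler rather than a custom one:

```python
import os

from torch.utils.data import ConcatDataset, DataLoader

# One windowed dataset per clip folder; concatenating them keeps windows
# from ever crossing a folder boundary.
clip_dirs = sorted(os.path.join("data", d) for d in os.listdir("data"))
dataset = ConcatDataset([FrameSequenceDataset(d, seq_len=5) for d in clip_dirs])

# Because __len__ already stops seq_len - 1 indices before the end of each
# clip, the default sequential sampler never produces a window that runs
# past a clip's last frame, so maybe no custom sampler is needed at all?
loader = DataLoader(dataset, batch_size=2, shuffle=False)

for batch in loader:
    print(batch.shape)  # (batch_size, seq_len, C, H, W)
    break
```

Is something like this the idiomatic way to do it, or am I missing a cleaner approach with a sampler?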
Thanks,
Erik