Using tensor data instead of ImageFolder to feed into an ESRGAN

Instead of working with image files such as .png and .jpg, I’ve extracted three variables u, v, w from a NetCDF file, each with dimensions (hours, number of grid points in X, number of grid points in Y). The u, v, w components can be seen as three separate channels, just like the RGB channels of an image. I then convert each component to a tensor with torch.from_numpy, reshape to (hours, channel, gridX, gridY), and concatenate them along the channel dimension to get my data. The resulting tensor has shape (745, 3, 132, 132).
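The stacking step described above could be sketched like this (the random arrays are stand-ins for the real NetCDF variables; variable names and shapes are taken from the description):

```python
import numpy as np
import torch

# Stand-ins for the u, v, w variables read from the NetCDF file,
# each of shape (hours, gridX, gridY) = (745, 132, 132).
hours, gx, gy = 745, 132, 132
u = np.random.rand(hours, gx, gy).astype(np.float32)
v = np.random.rand(hours, gx, gy).astype(np.float32)
w = np.random.rand(hours, gx, gy).astype(np.float32)

# Stack the three components as channels -> (hours, 3, gridX, gridY)
data = torch.stack([torch.from_numpy(c) for c in (u, v, w)], dim=1)
print(data.shape)  # torch.Size([745, 3, 132, 132])
```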

Now I want to plug it into an ESRGAN.
The problem is that the data loader uses the ImageFolder class and expects image files such as .png as inputs. In PyTorch we usually use ImageFolder with a DataLoader, which loads the images automatically. However, my data is already in tensor form. What can I do from here? For practical purposes, the tensor data can be treated as RGB images. I’m thinking about using the TensorDataset class, or just removing the whole data loader and sampling from my dataset directly.

Help appreciated!

I’m not sure if the posted shape ([745, 3, 132, 132]) is a single sample or your complete dataset.
In the latter case, you could simply use a TensorDataset, pass it to a DataLoader, and remove the currently used Dataset from the code.
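For example, a minimal sketch assuming the full (745, 3, 132, 132) tensor is the complete dataset (the batch size is arbitrary):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Stand-in for the full data tensor of shape (745, 3, 132, 132)
data = torch.randn(745, 3, 132, 132)

dataset = TensorDataset(data)              # each sample is a 1-tuple (sample,)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

batch, = next(iter(loader))                # unpack the 1-tuple
print(batch.shape)                         # torch.Size([16, 3, 132, 132])
```

Inside the training loop you would then iterate over `loader` instead of the loader built from the image folder.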


It is the latter case. But how should I change the code, since it expects me to pass a folder of images via dataroot? I also have to normalize the data; I’m thinking about creating my own transform or something like that. I will keep you updated.

I’m not sure which script you are using, but instead of loading the images you would just replace these lines with your loaded tensors.
What kind of transformations would you like to apply?

By transformation I meant normalization, but I fixed that. I’m trying to fit my tensor data into a DCGAN, using this as a basis.
However, I will open a new thread now that I have a clearer understanding of what I’m asking for, so feel free to delete this discussion.
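For anyone landing here later: the per-channel normalization mentioned above could be sketched like this. Scaling to [-1, 1] is an assumption matching a DCGAN generator’s tanh output range; the shapes follow the original post.

```python
import torch

data = torch.rand(745, 3, 132, 132)  # stand-in for the u/v/w tensor

# Per-channel min and max, computed over all hours and grid points
mins = data.amin(dim=(0, 2, 3), keepdim=True)
maxs = data.amax(dim=(0, 2, 3), keepdim=True)

# Scale each channel to [-1, 1]
normed = 2 * (data - mins) / (maxs - mins) - 1
print(normed.min().item(), normed.max().item())  # -1.0 1.0
```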