I am learning PyTorch and have tested some neural networks with MNIST data.
Now I want to test against a custom dataset by referring to the MNIST data structure, but I cannot see how that structure is built.
I have never done the kind of loading you are trying to do…
The thing is, the DataLoader creates an iterable, and then you must iterate over it:
for data_x, data_y in dataloader:
    pass
Take a look at this tutorial on how to create a Dataset class and use it to wrap your data.
It must implement the following methods:
class CustomDataset(Dataset):
    def __init__(self):
        """Load your data here."""
        pass

    def __len__(self):
        """Return the number of observations here."""
        pass

    def __getitem__(self, index):
        """Receives an index (or range) here and returns the sample accordingly."""
        pass
Then you create an instance of the dataset and pass it to your DataLoader.
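To make that concrete, here is a minimal sketch of the whole pipeline. The tensors `features` and `labels` are placeholder data invented for illustration; in practice `__init__` would load your own files.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    def __init__(self, features, labels):
        # In a real case you would load your files here instead.
        self.features = features
        self.labels = labels

    def __len__(self):
        # Number of observations in the dataset.
        return len(self.features)

    def __getitem__(self, index):
        # Return one (image, label) pair for the given index.
        return self.features[index], self.labels[index]

# 100 fake 28x28 grayscale "images" and 100 fake class labels,
# just to stand in for real data.
features = torch.randn(100, 1, 28, 28)
labels = torch.randint(0, 10, (100,))

dataset = CustomDataset(features, labels)
dataloader = DataLoader(dataset, batch_size=10, shuffle=True)

for data_x, data_y in dataloader:
    pass  # data_x has shape [10, 1, 28, 28], data_y has shape [10]
```

The DataLoader handles batching and shuffling for you; the Dataset only needs to answer "how many samples?" and "give me sample i".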
Actually, I displayed the images from the MNIST data during iteration and confirmed that
every unit in the data does have a label and an image.
However, because I cannot grasp what format the dataset is built into from the binary(?) data,
it is difficult for me to create another dataset with the same format as MNIST.
Your other suggestion is also very helpful for me.
As the next step, I will go through the tutorial and try to understand how to create a Dataset class and load the data with it.
If you would like to create another Dataset with the same structure as MNIST, e.g. as a replacement, have a look at torchvision.datasets.MNIST. There you will see which data is downloaded and unzipped, and how the binary data is processed.
Maybe that helps.
Hi again, sorry, I didn’t get earlier that you were referring to the actual structure of the data itself.
Look for the section on the page below: FILE FORMATS FOR THE MNIST DATABASE
There you can understand better how the data is held inside the files, so you can create yours.
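For reference, that page describes the IDX format: a big-endian header (a magic number, the image count, and the row and column counts) followed by one unsigned byte per pixel. Here is a rough sketch of writing and reading that image-file layout with the standard library; the filename is made up for the example.

```python
import struct

# Magic number for an IDX file of unsigned-byte 3D data (images).
IDX_IMAGE_MAGIC = 2051

def write_idx_images(path, images, rows=28, cols=28):
    """Write images (each a flat list of rows*cols byte values) in IDX layout."""
    with open(path, "wb") as f:
        # Header: magic number, number of images, rows, cols (big-endian uint32).
        f.write(struct.pack(">IIII", IDX_IMAGE_MAGIC, len(images), rows, cols))
        for img in images:
            f.write(bytes(img))  # one unsigned byte per pixel

def read_idx_images(path):
    """Read an IDX image file back into a list of flat pixel lists."""
    with open(path, "rb") as f:
        magic, n, rows, cols = struct.unpack(">IIII", f.read(16))
        assert magic == IDX_IMAGE_MAGIC, "not an IDX image file"
        return [list(f.read(rows * cols)) for _ in range(n)]

# Round-trip two fake 28x28 images: an all-black one and an all-white one.
imgs = [[0] * 784, [255] * 784]
write_idx_images("my_images.idx3-ubyte", imgs)
assert read_idx_images("my_images.idx3-ubyte") == imgs
```

The label files on that page use the same idea with a different magic number (2049) and no row/column fields, so adapting the sketch for labels is straightforward.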
Thank you again for your reply!
It’s all right, of course; rather, I really appreciate your replies.
This page seems to be fundamental for a better understanding of the MNIST dataset, as you say.
I will try it again.
I couldn’t have found this page without your help.
Thank you very much:)