Feed forward entire dataset on model

For my project I want to run some experiments after certain epochs finish training. For example, after the 3rd epoch I want to get the pre-softmax logit values of the network, so I need to run my entire dataset (MNIST in this case) through the model, forward only. This seems like it lends itself to parallelization; is there a way to run the model on my whole data array without a for loop?

Does your model have a softmax at the end? If not, the output of the final FC layer already gives you the logit values. If you do have a softmax, you can register a forward hook on the final FC layer to capture its activations before the softmax is applied.
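A minimal sketch of the forward-hook approach, assuming the final FC layer is an attribute called model.fc (adjust the name to your architecture):

import torch

logits = []

def save_logits(module, inp, out):
    # out is the output of the hooked layer, i.e. the pre-softmax values
    logits.append(out.detach())

hook = model.fc.register_forward_hook(save_logits)  # model.fc assumed to be the last FC layer
outputs = model(images)   # after this call, logits[-1] holds the pre-softmax values
hook.remove()             # remove the hook when you are done collecting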

You should be able to get the outputs from your model without a for loop by passing all the inputs you want it to classify in a single batch:

outputs = model(inputs)

You can then use outputs for whatever you want.
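For example, a sketch of that one-call forward pass for MNIST, where the batch of images is hypothetical and assumed to already be normalized float tensors of shape (N, 1, 28, 28):

import torch

# hypothetical batch: 10000 MNIST-sized images
inputs = torch.randn(10000, 1, 28, 28)

model.eval()                 # inference mode: disable dropout/batchnorm updates
with torch.no_grad():        # forward pass only, no gradients needed
    outputs = model(inputs)  # shape (10000, 10): one row of logits per image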


Thank you. When I try to pass my dataset into model() it says I cannot pass in an MNIST object (it must be a tensor). Do I have to make a separate dataloader for this, or is there a quick way to make my dataset into a tensor?

Your model expects a tensor, so you can't pass the MNIST dataset object directly; the simplest approach is to wrap it in a DataLoader. The DataLoader is set up with a given batch size, so each iteration yields a tensor of that many images along with their labels.

model.eval()                       # inference mode: disable dropout/batchnorm updates
with torch.no_grad():              # no gradients needed for a forward-only pass
    for images, labels in testloader:
        outputs = model(images)    # logits for the current batch
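To get the logits for the entire dataset in one tensor, you can collect the batch outputs and concatenate them; a sketch, assuming testloader is the DataLoader above:

import torch

all_logits = []
model.eval()
with torch.no_grad():
    for images, _ in testloader:
        all_logits.append(model(images))        # logits for this batch
all_logits = torch.cat(all_logits, dim=0)       # shape (len(dataset), num_classes)

Alternatively, creating the DataLoader with batch_size=len(dataset) gives you the whole set as a single tensor in one iteration, memory permitting.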