Fashion MNIST - ANN

Hi,

For a university project, we have to reach a certain accuracy level on the famous Fashion MNIST dataset with a neural network model.

I am having some issues connecting the test phase to the training phase. I am trying to save the best version of the model and then load it again to evaluate it:

I am trying to use this code, but I can’t get it right.

model = FashionMNISTNet(*images, **labels)
model.load_state_dict(torch.load(os.path.join))
model.eval()
TypeError Traceback (most recent call last)
in ()
----> 1 model = FashionMNISTNet(*images, **labels)
2 model.load_state_dict(torch.load(os.path.join))
3 model.eval()

TypeError: type object argument after ** must be a mapping, not Tensor

Is FashionMNISTNet a subclass of torch.nn.Module? If so, it is a bit strange to pass images and labels to the __init__ method, as the model definition itself usually doesn’t need to know anything about the data.

Can you show what the definition of FashionMNISTNet looks like?
The Imagenet Example also gives an example of how to use the model loading functions.
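For the “save the best version” part, the usual pattern is to checkpoint only the state_dict whenever the validation metric improves. A minimal sketch, assuming hypothetical names val_accuracy and best_accuracy computed in your training loop and a hypothetical file name 'fashion_mnist_best.pth':

import torch

# inside the training loop, after computing the validation accuracy
if val_accuracy > best_accuracy:
    best_accuracy = val_accuracy
    # save only the learned parameters, not the whole model object
    torch.save(model.state_dict(), 'fashion_mnist_best.pth')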

Sure, thank you for the answer.

This is where I took the FashionMNISTNet from:

import torch.nn as nn

# implement the FashionMNISTNet network architecture
class FashionMNISTNet(nn.Module):

    # define the class constructor
    def __init__(self):

        # call super class constructor
        super(FashionMNISTNet, self).__init__()

        # specify fully-connected (fc) layer 1 - in 28*28, out 100
        self.linear1 = nn.Linear(28*28, 100, bias=True)  # the linearity W*x+b
        self.relu1 = nn.ReLU(inplace=True)  # the non-linearity

        # specify fc layer 2 - in 100, out 75
        self.linear2 = nn.Linear(100, 75, bias=True)
        self.relu2 = nn.ReLU(inplace=True)

        # specify fc layer 3 - in 75, out 50
        self.linear3 = nn.Linear(75, 50, bias=True)  # the linearity W*x+b
        self.relu3 = nn.ReLU(inplace=True)  # the non-linearity

        # specify fc layer 4 - in 50, out 10
        self.linear4 = nn.Linear(50, 10)  # the linearity W*x+b

        # add a softmax to the last layer
        self.softmax = nn.Softmax(dim=1)  # the softmax

    # define network forward pass
    def forward(self, images):

        # reshape image pixels into a flat vector
        x = images.view(-1, 28*28)

        # fc layer 1 forward pass
        x = self.relu1(self.linear1(x))

        # fc layer 2 forward pass
        x = self.relu2(self.linear2(x))

        # fc layer 3 forward pass
        x = self.relu3(self.linear3(x))

        # fc layer 4 forward pass (followed by the softmax)
        x = self.softmax(self.linear4(x))

        # return forward pass result
        return x
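For reference, a minimal sanity check of the class above (a sketch, assuming torch is imported and the class is defined) would be:

import torch

net = FashionMNISTNet()             # no images or labels passed to __init__
dummy = torch.randn(4, 1, 28, 28)   # a fake batch of four 28x28 grayscale images
out = net(dummy)
print(out.shape)                    # torch.Size([4, 10]), one probability row per image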

It seems like you shouldn’t need to pass images or labels to the constructor; you can just call FashionMNISTNet without any arguments (e.g., model = FashionMNISTNet()).
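Putting it together, a minimal sketch of reloading the best checkpoint and evaluating on the test set, assuming the hypothetical checkpoint file 'fashion_mnist_best.pth' from above and a hypothetical test_loader DataLoader:

import torch

model = FashionMNISTNet()  # rebuild the architecture without any data arguments
model.load_state_dict(torch.load('fashion_mnist_best.pth'))  # restore the saved weights
model.eval()  # switch to evaluation mode (good practice even without dropout/batch norm)

correct, total = 0, 0
with torch.no_grad():  # no gradients needed for evaluation
    for images, labels in test_loader:
        outputs = model(images)
        predictions = outputs.argmax(dim=1)  # class with the highest probability
        correct += (predictions == labels).sum().item()
        total += labels.size(0)

print(f'test accuracy: {correct / total:.4f}')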