Training loss seems way too high

Hello, I have been learning PyTorch. I have built a couple of working models on the MNIST and Fashion-MNIST datasets, and I am now trying to make a more general-purpose one. For this I am taking a pretrained VGG model:

from torchvision import models

model = models.vgg11(pretrained=True)
# freeze the pretrained feature extractor; I don't need to retrain the VGG part
for param in model.parameters():
    param.requires_grad = False
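
As a quick sanity check (just a sketch using standard PyTorch introspection), I confirmed the freeze took effect:

# at this point no parameter in the model should be tracked for gradients
assert all(not p.requires_grad for p in model.parameters())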

Then I make my own classifier and put it at the end:

from collections import OrderedDict
import torch.nn as nn

classifier = nn.Sequential(OrderedDict([
    ('fc1', nn.Linear(class_in, class_hd_szs[0])),
    ('ReLU1', nn.ReLU()),
    ('fc2', nn.Linear(class_hd_szs[0], class_out_sz)),
    ('softmax', nn.LogSoftmax(dim=1))   # log-probabilities, paired with NLLLoss below
]))
model.classifier = classifier
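
One thing I double-checked (assuming the standard torchvision VGG11, whose feature extractor outputs 512 * 7 * 7 = 25088 values per image) is that class_in matches that size, and that only the new layers are trainable:

# the first classifier layer must accept VGG11's flattened feature size
assert model.classifier[0].in_features == 512 * 7 * 7   # i.e. class_in == 25088

# newly constructed Linear layers default to requires_grad=True, so only
# the replacement classifier should appear here
print([name for name, p in model.named_parameters() if p.requires_grad])
# expected: classifier.fc1.weight, classifier.fc1.bias,
#           classifier.fc2.weight, classifier.fc2.bias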

My model is meant to identify over 100 plant species common in the UK, using the Oxford 102 Flowers dataset: http://www.robots.ox.ac.uk/~vgg/data/flowers/102/index.html.
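
For context, my dataloaders are built roughly like this (the folder path, batch size, and transforms here are placeholders rather than my exact values):

import torch
from torchvision import datasets, transforms

train_transforms = transforms.Compose([
    transforms.RandomResizedCrop(224),           # VGG expects 224x224 inputs
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406],  # ImageNet normalisation stats,
                         [0.229, 0.224, 0.225]), # matching the pretrained weights
])
dataloaders = {
    'train': torch.utils.data.DataLoader(
        datasets.ImageFolder('flowers/train', transform=train_transforms),
        batch_size=64, shuffle=True),
}

When I attempt to train the model with the following procedure: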

import torch.optim as optim

epochs = 3
print_on = 5
steps = 0
model.to('cuda')

# create the loss and optimizer once; rebuilding Adam inside the epoch loop
# would throw away its running gradient averages every epoch
criterion = nn.NLLLoss()
optimizer = optim.Adam(model.classifier.parameters(), lr=0.001)

for e in range(epochs):
    running_loss = 0
    for i, (images, labels) in enumerate(dataloaders['train']):
        steps += 1
        images, labels = images.to('cuda'), labels.to('cuda')

        optimizer.zero_grad()
        output = model(images)   # call the model directly instead of model.forward
        loss = criterion(output, labels)
        loss.backward()
        optimizer.step()

        running_loss += loss.item()
        if steps % print_on == 0:
            print("on {} epoch loss was {:.4f}".format(e + 1, running_loss / print_on))
            running_loss = 0

My loss seems way too high. With this exact layout it starts as high as 25.0 and never gets below 4.0, no matter how many epochs I run. Am I correct in assuming that I am misunderstanding something? I was under the belief that the loss reflected the percentage of wrong outputs. Does anyone know what I am doing wrong? Is it a problem with the classifier layout that I just need to tweak more, or have I coded my training loop incorrectly? Thank you for any advice or ideas anyone can provide.
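
For reference, here is the back-of-envelope check that made me suspicious (a sketch assuming the dataset's 102 classes): NLLLoss on LogSoftmax outputs is just the negative log-probability of the correct class, so even a model guessing uniformly should sit near ln(102) ≈ 4.62, not 25:

import math
import torch
import torch.nn as nn

num_classes = 102                          # Oxford 102 Flowers
logits = torch.zeros(1, num_classes)       # a completely uninformed prediction
log_probs = nn.LogSoftmax(dim=1)(logits)   # uniform log-probabilities
target = torch.tensor([0])
print(nn.NLLLoss()(log_probs, target).item())  # ~4.62
print(math.log(num_classes))                   # 4.6249..., i.e. -log(1/102)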